The Data Sources API operates at the table level by default, so one non-native table must be created for each remote table, unlike Teradata Foreign Servers and Presto Catalogs, which operate at the database level. However, a Foreign Server Library is included as part of the Spark SQL connector and addresses some of the limitations and inconveniences of working with non-native tables. Teradata recommends using the Foreign Server Library to interact with the Spark SQL initiator; all Spark SQL initiator examples in this section are based on it. For complete details on the Foreign Server Library, see Foreign Server Library API Reference for the Spark SQL Initiator Connector.

The following steps provide an example of configuring a foreign server for use with a Spark SQL-to-TargetConnector link (where TargetConnector is any type of target connector):
- Set the link properties for the Spark SQL-to-Teradata link in the QueryGrid portlet. See Spark SQL Connector and Link Properties.
- Log on to the Scala REPL. See Starting Scala REPL for more information.
- Import the Foreign Server Library and create a foreign server object, for example:
scala> import tdqg.ForeignServer
import tdqg.ForeignServer

scala> val s1 = new ForeignServer("spark_to_teradata_link","active","fs1")
s1: tdqg.ForeignServer = tdqg.ForeignServer@4eb73cc8
- Use the foreign server to show remote schemas and verify the results, for example:
scala> s1.showSchemas
+-------------+
|DATABASE_NAME|
+-------------+
|default      |
|user1        |
+-------------+
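Conceptually, a foreign server object bundles the three constructor arguments shown above (a configured link name, an operating mode, and a server name) and exposes discovery operations such as showSchemas against the remote system. The following purely illustrative Scala sketch models that idea with invented names (ForeignServerSketch, RemoteCatalog); it is not the real tdqg.ForeignServer API and performs no remote communication:

```scala
// Illustrative sketch only -- NOT the tdqg.ForeignServer API.
// A stand-in for whatever schema metadata the remote system would report.
case class RemoteCatalog(schemas: Seq[String])

// Bundles the link name, mode, and server name, mirroring the three
// constructor arguments in the REPL example above.
class ForeignServerSketch(linkName: String,
                          mode: String,
                          serverName: String,
                          catalog: RemoteCatalog) {
  // The real connector would query the remote system over the named link;
  // this sketch simply returns the stubbed catalog contents.
  def showSchemas: Seq[String] = catalog.schemas
}

object Demo extends App {
  val s1 = new ForeignServerSketch(
    "spark_to_teradata_link", "active", "fs1",
    RemoteCatalog(Seq("default", "user1")))
  s1.showSchemas.foreach(println)
}
```

The point of the design is that link and server configuration are supplied once, at construction time, rather than repeated in every per-table definition as the table-level Data Sources API requires.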