The following example shows how Spark SQL issues an EXPLAIN request to a Teradata target connector.
scala> ForeignServer.sql("explain select name from default.nn1 where number = 10")
+------------------------------------------------------------------------------------+
|plan                                                                                |
+------------------------------------------------------------------------------------+
|== Physical Plan ==
*Project [name#25]
+- *Filter (isnotnull(number#24) && (number#24 = 10))
   +- *Scan com.teradata.querygrid.qgc.spark.QGRelation@4ec78008 default.nn1[name#25,number#24] PushedFilters: [IsNotNull(number), EqualTo(number,10)], ReadSchema: struct<name:string>|
+------------------------------------------------------------------------------------+

scala> ForeignServer.sql("explain insert into default.nn1 select * from players2")
+------------------------------------------------------------------------------------+
|plan                                                                                |
+------------------------------------------------------------------------------------+
|== Physical Plan ==
ExecutedCommand
+- InsertIntoDataSourceCommand Relation[number#24,name#25] com.teradata.querygrid.qgc.spark.QGRelation@4ec78008, OverwriteOptions(false,Map())
   +- MetastoreRelation default, players2
+------------------------------------------------------------------------------------+