What is Apache Flink SQL and Table API?
The Apache Flink Table API and SQL are relational APIs for both stream and batch processing. The Table API is a language-integrated query API for Scala, Java, and Python that lets us build queries from relational operators such as selection, filter, and join, whereas Flink's SQL support is based on Apache Calcite, which provides the SQL implementation. Queries written in either API produce the same result, whether they run on streaming or batch data.
The following is a sample program showing the common structure of an Apache Flink Table API and SQL program.
// Please use ExecutionEnvironment for batch programs
val environment = StreamExecutionEnvironment.getExecutionEnvironment

// Here we will create a table environment
val tab_Env = TableEnvironment.getTableEnvironment(environment)

// Now we will register an input Table
tab_Env.registerTable("tablename1", ...)          // or
tab_Env.registerTableSource("tablename2", ...)    // or
tab_Env.registerExternalCatalog("extCat", ...)

// Here we will register the output Table (a TableSink)
tab_Env.registerTableSink("outputTable", ...)

// We will create a Table using a Table API query
val tableapiResult = tab_Env.scan("tablename1").select(...)

// We will create a Table using a SQL query
val sqlResult = tab_Env.sqlQuery("SELECT ... FROM tablename2 ...")

// Send the Table API result to the registered TableSink
tableapiResult.insertInto("outputTable")

// Perform execution
environment.execute()
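To make the equivalence of the two APIs concrete, here is a minimal runnable sketch, assuming an older Flink 1.x release that matches the registration methods used above and a hypothetical Orders stream of (product, amount) pairs. The same filter-and-select query is written once with Table API operators and once in SQL, and both describe the same result.

import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.TableEnvironment
import org.apache.flink.table.api.scala._

object TableApiVsSqlExample {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val tableEnv = TableEnvironment.getTableEnvironment(env)

    // Hypothetical in-memory stream of (product, amount) records
    val orders: DataStream[(String, Int)] = env.fromElements(
      ("rubber", 30), ("steel", 10), ("rubber", 25))

    // Register the stream as a table named "Orders" with named fields
    tableEnv.registerDataStream("Orders", orders, 'product, 'amount)

    // Table API: the query is built from language-integrated operators
    val apiResult = tableEnv
      .scan("Orders")
      .filter('amount > 20)
      .select('product, 'amount)

    // SQL: the equivalent query, parsed and planned through Apache Calcite
    val sqlResult = tableEnv.sqlQuery(
      "SELECT product, amount FROM Orders WHERE amount > 20")

    // Both tables hold the same result; convert back to streams and print
    apiResult.toAppendStream[(String, Int)].print()
    sqlResult.toAppendStream[(String, Int)].print()

    env.execute("Table API vs SQL")
  }
}

Running this prints the rows for which amount is greater than 20 twice, once from each query, which illustrates that the Table API and SQL are simply two ways of expressing the same relational operation.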