NGramSplitter Example: Omit Accumulate | Teradata® Database

Database Analytic Functions

Product: Teradata® Database
Release Number: 17.10
Published: July 2021
Language: English (United States)
Last Update: 2021-07-28
dita:id: B035-1206
Product Category: Teradata Vantage™

Input

The input table, paragraphs_input, contains paragraphs that describe commonly used machine learning techniques.

paragraphs_input

paraid | paratopic | paratext
------ | --------- | --------
1 | Decision Trees | Decision tree learning uses a decision tree as a predictive model which maps observations about an item to conclusions about the items target value. It is one of the predictive modeling approaches used in statistics, data mining and machine learning. Tree models where the target variable can take a finite set of values are called classification trees. In these tree structures, leaves represent class labels and branches represent conjunctions of features that lead to those class labels. Decision trees where the target variable can take continuous values (typically real numbers) are called regression trees.
2 | Simple Regression | In statistics, simple linear regression is the least squares estimator of a linear regression model with a single explanatory variable. In other words, simple linear regression fits a straight line through the set of n points in such a way that makes the sum of squared residuals of the model (that is, vertical distances between the points of the data set and the fitted line) as small as possible
... | ... | ...

SQL Call

SELECT * FROM NGramSplitter (
  ON paragraphs_input
  USING
  TextColumn ('paratext')
  Grams ('4-6')
  OutputTotalGramCount ('true')
) AS dt;
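For comparison, the Accumulate syntax element (which this example deliberately omits) lets you name exactly which input columns to copy to the output. A sketch of that variant, assuming the two identifier columns from the input table, might look like:

```sql
SELECT * FROM NGramSplitter (
  ON paragraphs_input
  USING
  TextColumn ('paratext')
  Grams ('4-6')
  OutputTotalGramCount ('true')
  Accumulate ('paraid', 'paratopic')  -- assumed columns, copied to output
) AS dt;
```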

Output

paraid | paratopic | ngram | n | frequency | totalcnt
------ | --------- | ----- | - | --------- | --------
1 | Decision Trees | decision tree learning uses | 4 | 1 | 73
1 | Decision Trees | decision tree learning uses a | 5 | 1 | 66
1 | Decision Trees | decision tree learning uses a decision | 6 | 1 | 60
1 | Decision Trees | tree learning uses a | 4 | 1 | 73
1 | Decision Trees | tree learning uses a decision | 5 | 1 | 66
1 | Decision Trees | tree learning uses a decision tree | 6 | 1 | 60
1 | Decision Trees | learning uses a decision | 4 | 1 | 73
1 | Decision Trees | learning uses a decision tree | 5 | 1 | 66
1 | Decision Trees | learning uses a decision tree as | 6 | 1 | 60
... | ... | ... | ... | ... | ...
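To make the output semantics concrete, here is a minimal Python sketch of what the function computes for one text value: lowercase the text, split it on whitespace, and emit every n-gram for each n in the Grams range, with per-n-gram frequencies and, mirroring OutputTotalGramCount ('true'), a total n-gram count per n. This is an illustration only, not Teradata's implementation; it ignores the function's other syntax elements (delimiter and punctuation handling), so totals computed on the full paragraphs will not match the totalcnt values above exactly.

```python
from collections import Counter

def ngram_split(text, min_n, max_n):
    """Sketch of NGramSplitter's core logic for a single text value.

    Returns (freq, totals):
      freq   - Counter keyed by (ngram, n): how often each n-gram occurs
      totals - Counter keyed by n: total number of n-grams of that length
    """
    words = text.lower().split()          # the function lowercases n-grams
    freq = Counter()
    totals = Counter()
    for n in range(min_n, max_n + 1):     # Grams ('4-6') -> n in 4..6
        for i in range(len(words) - n + 1):
            gram = " ".join(words[i:i + n])
            freq[(gram, n)] += 1
            totals[n] += 1
    return freq, totals

# A 7-word fragment of the first input row, with Grams ('4-6'):
freq, totals = ngram_split("Decision tree learning uses a decision tree", 4, 6)
print(totals[4], totals[5], totals[6])  # prints "4 3 2" (7-n+1 n-grams per n)
```

With 7 words there are 7 - n + 1 n-grams for each n, which is why totalcnt in the real output repeats the same value for every row with the same n.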