1.0 - 8.00 - NGrams Example 1: Overlapping ('true'), TotalGramCount ('true') - Teradata Vantage

Teradata® Vantage Machine Learning Engine Analytic Function Reference

Product: Teradata Vantage
Release Number: 1.0, 8.00
Release Date: May 2019
Content Type: Programming Reference
Publication ID: B700-4003-098K
Language: English (United States)

Input

  • Input Table: paragraphs_input, which has paragraphs about common analytics topics (regression, decision trees, and so on)
Input Table: paragraphs_input

paraid | paratopic | paratext
1 | Decision Trees | Decision tree learning uses a decision tree as a predictive model which maps observations about an item to conclusions about the items target value. It is one of the predictive modelling approaches used in statistics, data mining and machine learning. Tree models where the target variable can take a finite set of values are called classification trees. In these tree structures, leaves represent class labels and branches represent conjunctions of features that lead to those class labels. Decision trees where the target variable can take continuous values (typically real numbers) are called regression trees.
2 | Simple Regression | In statistics, simple linear regression is the least squares estimator of a linear regression model with a single explanatory variable. In other words, simple linear regression fits a straight line through the set of n points in such a way that makes the sum of squared residuals of the model (that is, vertical distances between the points of the data set and the fitted line) as small as possible
... | ... | ...

SQL Call

SELECT * FROM NGrams (
  ON paragraphs_input
  USING
  TextColumn ('paratext')
  Delimiter (' ')
  Grams ('4-6')
  OverLapping ('true')
  ToLowerCase ('true')
  Reset ('[.,?!]')
  Punctuation ('[`~#^&*()-]')
  TotalGramCount ('true')
  Accumulate ('paraid', 'paratopic')
) AS dt ORDER BY paraid, paratopic, ngram;
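To make the call's semantics concrete, the sketch below mimics (in Python, purely as an illustration and not Teradata's implementation) what this combination of arguments does: lowercase the text, strip the Punctuation characters, restart the n-gram window at every Reset character, slide an overlapping window for each length in Grams ('4-6'), and report each n-gram's frequency together with totalcnt, the total number of n-grams of that length in the text.

```python
import re
from collections import Counter

def ngrams(text, min_n=4, max_n=6,
           reset=r"[.,?!]", punctuation=r"[`~#^&*()-]"):
    """Illustrative sketch of overlapping n-gram counting.

    Mirrors the SQL call above: ToLowerCase folds case, Punctuation
    characters are removed, Reset characters end one n-gram run and
    start another, and Overlapping ('true') means windows advance one
    token at a time. Returns rows of (ngram, n, frequency, totalcnt),
    where totalcnt is the count of all n-grams of that length.
    """
    text = text.lower()
    freq = Counter()    # (ngram, n) -> occurrences of that n-gram
    total = Counter()   # n -> total n-grams of length n in the text
    for sentence in re.split(reset, text):
        tokens = re.sub(punctuation, "", sentence).split()
        for n in range(min_n, max_n + 1):
            # overlapping windows: step by one token, not by n
            for i in range(len(tokens) - n + 1):
                gram = " ".join(tokens[i:i + n])
                freq[(gram, n)] += 1
                total[n] += 1
    return [(gram, n, f, total[n]) for (gram, n), f in freq.items()]
```

For example, `ngrams("the quick brown fox jumps")` yields two overlapping 4-grams, each with frequency 1 and totalcnt 2, and one 5-gram with totalcnt 1, matching the per-length totalcnt pattern (73, 66, 60) visible in the output below.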

Output

paraid | paratopic | ngram | n | frequency | totalcnt
1 | Decision Trees | decision tree learning uses | 4 | 1 | 73
1 | Decision Trees | decision tree learning uses a | 5 | 1 | 66
1 | Decision Trees | decision tree learning uses a decision | 6 | 1 | 60
1 | Decision Trees | tree learning uses a | 4 | 1 | 73
1 | Decision Trees | tree learning uses a decision | 5 | 1 | 66
1 | Decision Trees | tree learning uses a decision tree | 6 | 1 | 60
1 | Decision Trees | learning uses a decision | 4 | 1 | 73
1 | Decision Trees | learning uses a decision tree | 5 | 1 | 66
1 | Decision Trees | learning uses a decision tree as | 6 | 1 | 60
... | ... | ... | ... | ... | ...