TD_NumApply applies a specified numeric operator to the specified input table columns. You can apply the following numeric operators:
- EXP: The exponential function is a mathematical function denoted by f(x) = exp(x) = e^x (where the argument x is written as an exponent). The term generally refers to the positive-valued function of a real variable, although it can be extended to the complex numbers or generalized to other mathematical objects such as matrices or Lie algebras. You can use an exponential function to calculate the exponential growth or decay of a given set of data. For example, you can use exponential functions to model changes in population, loan interest charges, bacterial growth, radioactive decay, or the spread of disease.
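As an illustration of what the EXP operator computes (in Python rather than the actual TD_NumApply SQL call), the sketch below models continuous exponential growth; the function name, initial population, and growth rate are illustrative values, not part of the documentation:

```python
import math

def exponential_growth(n0, rate, t):
    """Population at time t under continuous growth N(t) = N0 * exp(rate * t)."""
    return n0 * math.exp(rate * t)

# 100 individuals growing at 5% per unit time, observed at t = 10.
print(exponential_growth(100, 0.05, 10))
```

A negative rate models the decay case (for example, radioactive decay) with the same formula.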
- LOG: Applying logarithms brings values onto a comparable scale and introduces linearity. This function is denoted by f(x) = ln(x), which is a strictly increasing function. Two main reasons to use logarithmic scales in charts and graphs follow:
- To respond to skewness towards large values; specifically, cases in which one or a few points are much larger than the bulk of the data.
- To show percent change or multiplicative factors.
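Both reasons above can be seen in a short Python sketch (the sample values are illustrative, not from the documentation): the natural log turns equal ratios into equal differences, which tames skew toward large values, and small log differences approximate percent change:

```python
import math

# Skewed values spanning several orders of magnitude.
values = [1, 10, 100, 1_000, 1_000_000]

# On a log scale, each tenfold jump becomes the same additive step.
logged = [math.log(v) for v in values]
print(logged)

# Log difference approximates percent change for small changes:
# ln(105 / 100) is close to the 5% change itself.
print(math.log(105 / 100))
```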
- SIGMOID: A sigmoid function is a bounded, differentiable, real function that is defined for all real input values, has a non-negative derivative at each point, and has exactly one inflection point. It is denoted by f(x) = 1 / (1 + e^(-x)). The sigmoid function's ability to transform any real number to one between 0 and 1 is advantageous in data science and many other fields, such as deep learning, where it serves as a non-linear activation function within neurons, allowing artificial neural networks to learn non-linear relationships in the data.
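A minimal Python sketch of the sigmoid formula shows the squashing behavior described above (illustration only, not the TD_NumApply invocation):

```python
import math

def sigmoid(x):
    # f(x) = 1 / (1 + e^(-x)): maps any real number into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(0.0))    # midpoint: 0.5
print(sigmoid(10.0))   # large positive input saturates toward 1
print(sigmoid(-10.0))  # large negative input saturates toward 0
```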
- SININV: Sine inverse, or arcsine, is the inverse of the sine function. It is denoted by f(x) = sin^-1(x) = arcsin(x).
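Python's standard library exposes arcsine as `math.asin`, which illustrates the SININV operator: it accepts inputs in [-1, 1] and returns radians in [-pi/2, pi/2]:

```python
import math

print(math.asin(0.0))            # 0.0
print(math.asin(1.0))            # pi/2
# arcsine undoes sine on [-pi/2, pi/2]:
print(math.asin(math.sin(0.5)))  # recovers 0.5
```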
- TANH: Tanh is the hyperbolic tangent function, denoted by f(x) = tanh(x). The tanh function is mainly used for classification between two classes. Both tanh and the logistic sigmoid activation functions are used in feed-forward nets. Use the hyperbolic tangent sigmoid function as the transfer function for hidden neurons and the linear transfer function for the output layer.
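The two-class use of tanh mentioned above can be sketched in Python: tanh squashes any real input into (-1, 1), so the sign of the activation can serve as a simple decision rule (the `classify` helper and its class labels are illustrative, not part of the documentation):

```python
import math

def classify(score):
    # tanh(score) lies in (-1, 1); its sign picks one of two classes.
    activation = math.tanh(score)
    return "positive" if activation > 0 else "negative"

print(math.tanh(0.0))  # 0.0, the inflection point
print(math.tanh(3.0))  # saturates toward 1
print(classify(2.5))
print(classify(-0.7))
```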