Thursday, December 12, 2013

IT Interpolation Telescope

It's become known here at the Big Brain Space Exploratory Initiative and its Ultra Space Administration that hidden elements remain within the celestial monstrosities of the Universe, worked in and between such things as gravatomic forces, bright matter events, dark matter, black holes and worm holes, moleculars, and quantum doors to the multiverse, for example.

However, extracting strange new hidden elemental forces and unforeseen objects is a task that requires special tools that don't yet exist. So we're out to create some of these tools, do the pathfinding implementations, and see what materializes in the wake of science and technology.

The ET Extrapolation Telescope has inspired the IT, or Interpolation Telescope, which is designed to reconstruct views in the universe that have sections of obscurity. Sections of obscurity may be related to obscuring matter and objects, antimatter, dark matter, objects partially in time flux and temporal transformations, quantum flux and effects, objects obscured by physical conditions like light, gravity, molecule aggregates, stages of evolution, degrees of extinction, and numerous other effects.

The IT is designed to make the observation, then render it more complete by filling in the voids through processes of interpolation. Therefore, the IT is built upon interpolation algorithms. In the future, the mechanisms for the IT Telescope will be incorporated into the OTT Telescope, which contains the functions of other extremely powerful telescopes.
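The void-filling idea can be sketched in miniature. Here is a hedged, hypothetical illustration (the IT's actual constructs aren't specified): a 1-D scan with missing samples, marked `None`, gets its gaps filled by linearly interpolating between the nearest known neighbors.

```python
def fill_voids(samples):
    """Fill None gaps in a 1-D scan by linear interpolation
    between the nearest known samples on either side.
    (Illustrative sketch only, not the IT's real machinery.)"""
    out = list(samples)
    known = [i for i, v in enumerate(out) if v is not None]
    for i, v in enumerate(out):
        if v is None:
            left = max((k for k in known if k < i), default=None)
            right = min((k for k in known if k > i), default=None)
            if left is None or right is None:
                continue  # a void at the very edge cannot be interpolated
            t = (i - left) / (right - left)
            out[i] = out[left] + t * (out[right] - out[left])
    return out

print(fill_voids([1.0, None, None, 4.0, 5.0]))  # [1.0, 2.0, 3.0, 4.0, 5.0]
```

Extrapolation, the ET's specialty, would be needed for voids at the edges; interpolation only works between known samples.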

There are many types of interpolation methods, such as algorithmic, piecewise constant, linear, polynomial, spline, Gaussian processes, rational functions, trigonometric, trigonometric polynomials, wavelets, the Whittaker–Shannon interpolation formula, multivariate, bilinear, bicubic, trilinear, sampling, curve fitting, approximation, least squares, barycentric coordinates, Brahmagupta's interpolation formula, extrapolation, imputation (statistics), missing data, Newton–Cotes formulas, and simple rational approximation. Also refer to Quadratic/Cubic versus Linear Interpolations by Alain Brobecker, Interpolation Methods by Paul Bourke, Tweener transition types cheat sheet, Hermite Curve Interpolation by Nils Pipenbrinck, Tweening by Robert Penner, Easing Equations by Robert Penner and Sol Tutorials - Interpolation Tricks.
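To make two of the listed methods concrete, here is a short sketch of piecewise-linear interpolation next to cubic Hermite interpolation (the kind covered in the Pipenbrinck and Penner references above). The cubic form takes tangents at the endpoints, which is what gives tweened motion its eased feel.

```python
def lerp(y0, y1, t):
    """Piecewise-linear interpolation between two samples, t in [0, 1]."""
    return y0 + t * (y1 - y0)

def hermite(y0, y1, m0, m1, t):
    """Cubic Hermite interpolation: endpoint values y0, y1 and
    endpoint tangents m0, m1, evaluated at t in [0, 1]."""
    t2, t3 = t * t, t * t * t
    return ((2 * t3 - 3 * t2 + 1) * y0      # basis h00
            + (t3 - 2 * t2 + t) * m0        # basis h10
            + (-2 * t3 + 3 * t2) * y1       # basis h01
            + (t3 - t2) * m1)               # basis h11

print(lerp(0.0, 1.0, 0.5))              # 0.5
print(hermite(0.0, 1.0, 0.0, 0.0, 0.5)) # 0.5, but eases in and out
```

With zero tangents, the Hermite curve passes through the same midpoint as the linear one but approaches the endpoints with zero slope, i.e. it eases.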

"In the mathematical field of numerical analysis, interpolation is a method of constructing new data points within the range of a discrete set of known data points.

In engineering and science, one often has a number of data points, obtained by sampling or experimentation, which represent the values of a function for a limited number of values of the independent variable. It is often required to interpolate (i.e. estimate) the value of that function for an intermediate value of the independent variable. This may be achieved by curve fitting or regression analysis.
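As a hedged sketch of the situation the quote describes, here sampled data points stand in for experimental measurements, and the Lagrange interpolating polynomial (one standard construction, not necessarily what any given instrument uses) estimates the function at an intermediate value of the independent variable.

```python
def lagrange(points, x):
    """Estimate f(x) from sampled (x_i, y_i) pairs using the
    Lagrange interpolating polynomial."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        term = yi
        for j, (xj, _) in enumerate(points):
            if j != i:
                term *= (x - xj) / (xi - xj)  # Lagrange basis factor
        total += term
    return total

# Samples of f(x) = x**2 taken at x = 0, 1, 3; estimate f at x = 2.
print(lagrange([(0, 0.0), (1, 1.0), (3, 9.0)], 2.0))  # 4.0
```

Because three points determine a quadratic exactly, the estimate here is exact; with noisy or non-polynomial data, curve fitting or regression (as the quote notes) may be the better choice.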

A different problem which is closely related to interpolation is the approximation of a complicated function by a simple function. Suppose the formula for some given function is known, but too complex to evaluate efficiently. A few known data points from the original function can be used to create an interpolation based on a simpler function. Of course, when a simple function is used to estimate data points from the original, interpolation errors are usually present; however, depending on the problem domain and the interpolation method used, the gain in simplicity may be of greater value than the resultant loss in accuracy.
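The simplicity-versus-accuracy trade-off above can be shown with a toy example: replace `math.sin` (standing in for the "complicated function") with a coarse piecewise-linear table. The table names and sizes here are illustrative choices, not anything from the source.

```python
import math

# Coarse lookup table for sin(x) on [0, pi] -- the "simple function".
N = 8
xs = [math.pi * i / N for i in range(N + 1)]
ys = [math.sin(x) for x in xs]

def cheap_sin(x):
    """Piecewise-linear approximation of sin(x) on [0, pi]."""
    step = math.pi / N
    i = min(int(x / step), N - 1)     # index of the bracketing interval
    t = (x - xs[i]) / step            # position within that interval
    return ys[i] + t * (ys[i + 1] - ys[i])

# The interpolation error is small but nonzero:
print(abs(cheap_sin(1.0) - math.sin(1.0)))  # roughly 0.016
```

Evaluating the table costs a few arithmetic operations instead of a full `sin` call, at the price of an error around a percent; a finer table or a higher-order interpolant shrinks the error further.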

There is also another very different kind of interpolation in mathematics, namely the "interpolation of operators". The classical results about interpolation of operators are the Riesz–Thorin theorem and the Marcinkiewicz theorem. There are also many other subsequent results." Wikipedia

"In the domain of digital signal processing, the term interpolation refers to the process of converting a sampled digital signal (such as a sampled audio signal) to a higher sampling rate (Upsampling) using various digital filtering techniques (e.g., convolution with a frequency-limited impulse signal). In this application there is a specific requirement that the harmonic content of the original signal be preserved without creating aliased harmonic content of the original signal above the original Nyquist limit of the signal (i.e., above fs/2 of the original signal sample rate). An early and fairly elementary discussion on this subject can be found in Rabiner and Crochiere's book Multirate Digital Signal Processing." Wikipedia
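A minimal upsampling sketch, with the caveat the quote itself raises: real resamplers use band-limited (windowed-sinc) filtering to avoid aliasing, and the linear interpolation below is only the simplest stand-in for that filtering step.

```python
def upsample2(signal):
    """Double the sampling rate by inserting a linearly interpolated
    sample between each pair of input samples. A production resampler
    would instead zero-stuff and convolve with a band-limited
    (windowed-sinc) filter, as the quoted passage describes."""
    out = []
    for a, b in zip(signal, signal[1:]):
        out.append(a)
        out.append((a + b) / 2.0)  # interpolated midpoint sample
    out.append(signal[-1])
    return out

print(upsample2([0.0, 1.0, 0.0, -1.0]))
# [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0]
```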

In summary, the IT project has developed its own set of interpolative tooling to complement the many imaging aspects of the IT Telescope, primarily using constructs for objects in space and time. Whether these IT constructs achieve the desired result remains to be seen, as a lot of testing and calibration is required to further hone these instruments.