Texas A&M University: New Algorithm Could Reduce Complexity of Big Data
March 16, 2021
COLLEGE STATION, Texas, March 16 (TNSRes) -- Texas A&M University issued the following news:
Whenever a scientific experiment is conducted, the results are turned into numbers, often producing huge datasets. In order to reduce the size of the data, computer programmers use algorithms that can find and extract the principal features that represent the most salient statistical properties. But many such algorithms cannot be applied directly to these large volumes of data.
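A widely used example of this kind of feature-extraction algorithm is principal component analysis (PCA), which finds the directions in a dataset that capture the most variance. The sketch below, using synthetic data, is only a generic illustration of the technique the article describes, not the new algorithm itself:

```python
import numpy as np

# Hypothetical toy data: 1000 samples with 50 correlated measurements,
# generated from just 3 underlying factors plus a little noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(1000, 3))     # 3 true underlying factors
mixing = rng.normal(size=(3, 50))       # each measurement mixes the factors
data = latent @ mixing + 0.1 * rng.normal(size=(1000, 50))

# PCA via the singular value decomposition: center the data, then the
# right singular vectors give the principal directions of variance.
centered = data - data.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)

# Fraction of total variance captured by the top 3 principal components.
explained = (s[:3] ** 2).sum() / (s ** 2).sum()

# Project onto those components: 50 columns reduced to 3 principal features.
reduced = centered @ Vt[:3].T
print(reduced.shape, round(float(explained), 3))
```

Because the 50 measurements were built from only 3 factors, the top 3 components recover nearly all of the variance, shrinking the dataset from 50 columns to 3 while preserving its most salient statistical structure. Classical SVD-based PCA, however, requires holding and decomposing the whole matrix at once, which is exactly the kind of step that becomes impractical at very large scales.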