One of the most critical assets of any company is its data. With the growth of automated technology across business functions such as finance, HR, administration and marketing, huge amounts of data are generated every day, producing what is known as big data.
This calls for sophisticated handling and storage systems. Big data may be needed from time to time for crucial business decisions, such as tracking customers’ purchasing trends to optimize stock intake, or debtors’ payment trends to inform budgeting.
To use this data, it must be analysed and presented in comprehensible formats such as tables, graphs and plots. This is where data visualization developers come in: they design tools that facilitate proper understanding and decision making.
The Big Question
With big data continually growing, there was an increasing need for a high-speed transmission arrangement that could handle the ever-growing volume of information being moved from place to place. The big question was: how can information transfer be made faster? With such large volumes of data, a firm whose transmission technology was not sufficiently swift would find data manipulation difficult. It would also have been hard to adopt the latest technologies built on massive data streams, such as virtual reality.
Fibre-optic communication emerged as the best solution, owing to its higher speed of data transfer over long distances.
Researchers claimed a new world record for data transmission over long distances. Data was transmitted in both directions at a combined speed of 186Gbps (gigabits per second), fast enough to move roughly two million gigabytes of data in a day. This was to be a front-runner technology for networks with an average transfer speed of 100Gbps, making it easier to share scientific research.
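The "two million gigabytes in a day" figure follows directly from the 186Gbps rate. A minimal back-of-the-envelope check (using decimal units and 1 byte = 8 bits):

```python
# Sanity check: does a sustained 186 Gbps link really move
# roughly two million gigabytes in a day?

GBPS = 186                      # combined two-way rate, gigabits per second
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 seconds

gigabits_per_day = GBPS * SECONDS_PER_DAY
gigabytes_per_day = gigabits_per_day / 8   # 8 bits per byte

print(f"{gigabytes_per_day:,.0f} GB/day")  # → 2,008,800 GB/day
```

At just over two million gigabytes per day, the arithmetic matches the article's claim.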
The tests were based on sending and receiving data between the University of Victoria in Victoria, British Columbia, and the Washington State Convention Centre in Seattle. The team achieved two-way data transfer rates of 186Gbps, beating its earlier record of 119Gbps set in 2009. The highest speed recorded in a single direction was 98Gbps. In comparison, the fastest speed offered to the public was a 1.5Gbps broadband connection that Virgin Media had trialled in East London.
The link covered about 131 miles (212km), used the most recent optical equipment and highly tuned servers, and operated over a 100Gbps circuit. In another demonstration of greater magnitude, researchers moved enormous volumes of data between a booth at the SuperComputing 2011 conference in Seattle and sites in the US, South Korea and Brazil.
The tests mobilised great minds in the field – physicists, network engineers and computer scientists from several top institutions such as Florida International University, the University of Michigan, the California Institute of Technology, the University of Victoria and the European Centre for Nuclear Research (CERN).
A professor of physics who led the physicists’ team said that the team and its partners were demonstrating how data would be handled and transmitted in the future.
The Importance of the Research
From the researchers’ point of view, the achievements helped create new ways to transfer the large amounts of data being carried via fibre-optic networks from one continent to another.
One of the professors in the research team emphasised how having the tools to move data at these speeds would turn ambitious visions into achievable goals. It also gave the team a clearer idea of the future of data transmission, which, for many others, was still inconceivable.
Quick transmission of data was especially critical for sharing the output of enormous scientific projects such as the Large Hadron Collider (LHC). The LHC hit the headlines when scientists announced that they had possibly sighted the elusive Higgs boson particle, which is said to give everything in the universe its mass.
The volumes of data were expected to increase as the teams working on the vast scientific projects increased their efforts. Also, it was imperative for them to be in a position to share the data with other researchers from all over the world. Giving these scientists a chance to work on the LHC data was a primary goal of the project, since it aimed to solve some of the world’s largest mysteries.
The demonstration of data transmission at 100Gbps proved that the limits of network technology can be pushed: it is possible to move petascale particle physics data to any place in the world within hours.
Efforts to increase data transfer speeds using light-based telecommunications technologies have intensified in recent years. In one of the latest tests, researchers set a new record for the rate of data movement by employing a laser, with a speed of 26 terabits per second. In other words, at this speed it would be possible to transfer, via fibre optics, the data of about 1,000 high-definition DVDs within a second.
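The DVD comparison can be checked with the same kind of arithmetic. A quick sketch, where the ~3.25GB per-disc size is an assumption chosen for illustration (it is not stated in the article):

```python
# Rough sanity check of the 26 Tbps record: how much data per second,
# and how many disc-sized files does that correspond to?

TBPS = 26                                 # terabits per second
terabytes_per_second = TBPS / 8           # 3.25 TB/s
gigabytes_per_second = terabytes_per_second * 1000

ASSUMED_FILE_GB = 3.25                    # hypothetical per-disc size, an assumption
files_per_second = gigabytes_per_second / ASSUMED_FILE_GB

print(f"{gigabytes_per_second:.0f} GB/s ≈ {files_per_second:.0f} discs/s")
```

So the "about 1,000 discs per second" claim corresponds to discs of roughly 3.25GB each; larger per-disc sizes would proportionally lower the count.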