
"While everyone's focused on the toy elephant, the eight hundred pound gorilla goes unnoticed"

Updated: Sep 10, 2018





Storing big data in the cloud is easy; getting it there is the hard part.


In 2012, we created 2.5 quintillion bytes of data every day, and the rate keeps growing. This is creating a massive challenge in storing and moving all this Big Data. What's more, Big Data is fuelling the growth of cloud-based storage. Cloud storage has its advantages: pay-as-you-go billing, elasticity, compatibility with multiple devices, and the list continues to grow along with innovations in cloud storage. So the world will increasingly store ever-growing volumes of Big Data in the cloud on different platforms, including private, public, hybrid and elastic. But the volume that can be moved economically, and indeed physically (because of latency), is limited by present upload and download speeds and by transfer protocols.


There have been several novel methods and approaches to moving Big Data. They range from physical migration of data (e.g. shipping hard disks on trucks) to Gigabit Ethernet, Amazon's S3, and capacity-changing and routing algorithms that push data at 500 Gbit/s. Newer players in these segments include Aspera and Data Expedition, offering patented bulk data transfer technology. Aspera claims its transfer layer lets content such as video reach the cloud more quickly than standard TCP, with significant performance improvements that make up to 100% use of available bandwidth. Fujitsu and Unisys have their own versions of bulk data transfer technology, with performance typically quoted as 10-20-fold increases in upload speed. Storage vendors including EMC, HP and Cisco offer only standard compression algorithms; these market leaders deliver no more than a 50-fold performance improvement on pre-compression data volumes.


But what if there was a way to reduce big data volumes by several orders of magnitude?


The patent-pending Information Density Holography (IDH) is a groundbreaking, hyper-fast Big Data access and movement technology that can do just that. It can reduce data volumes by at least two orders of magnitude. By reproducing the data holographically, IDH can radically reduce cloud hosting and storage requirements.


How does IDH work? It uses Fourier transforms and theories of Boltzmann-Shannon entropy equivalence to derive innovative algebraic encoding algorithms that reduce the volume of high-dimensional data, audio, image and video. By reducing the data to patterns (using a new and innovative kind of digital holography), it can be moved faster. Metadata extracted from the source enables faithful, high-fidelity reproduction of the data. The greater the number of dimensions, the larger the reduction. However, there is not always a need to reconstruct the data; it can be kept in reduced form, e.g. for archival storage.
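
To make the idea of reducing data to frequency-domain patterns concrete, here is a toy sketch in Python: it keeps only the strongest 2-D Fourier coefficients of an image, together with the metadata needed to place them back, and reconstructs an approximation from that sparse pattern. This is a generic Fourier-reduction illustration only, not the patented IDH encoding.

```python
import numpy as np

def fourier_reduce(image, keep_fraction=0.01):
    """Toy Fourier-domain reduction: keep only the largest 2-D FFT
    coefficients plus the metadata needed to restore them.
    A generic illustration, not the actual IDH algorithm."""
    spectrum = np.fft.fft2(image)
    flat = spectrum.ravel()
    k = max(1, int(keep_fraction * flat.size))
    idx = np.argpartition(np.abs(flat), -k)[-k:]   # k largest coefficients
    return idx, flat[idx], image.shape             # "pattern" plus metadata

def fourier_restore(idx, values, shape):
    """Rebuild an approximate image from the sparse coefficient pattern."""
    flat = np.zeros(shape[0] * shape[1], dtype=complex)
    flat[idx] = values
    return np.real(np.fft.ifft2(flat.reshape(shape)))

# Example: a 256x256 array reduced to ~1% of its Fourier coefficients
img = np.random.rand(256, 256)
idx, vals, shape = fourier_reduce(img)
approx = fourier_restore(idx, vals, shape)
```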


IDH requires highly parallel processing hardware and involves a multi-stage process. In the first stage, principal component analysis, a technique used in topological data analysis, provides data volume reduction directly proportional to the number of dimensions within the data. In the second stage, algorithms convert the reduced data to a topological pattern, which in the third stage is turned into a digital hologram with further dimension and size reduction. A fourth, floating-point compression stage provides an additional reduction of over 90% in data sets with a large number of dimensions.
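
Only the first of these stages is a published technique; a minimal PCA sketch of that stage is shown below (the pattern, hologram and floating-point compression stages are not reproduced here, and the data sizes and component count are arbitrary examples).

```python
import numpy as np

def pca_reduce(data, n_components):
    """Stage-one style reduction: project high-dimensional rows onto
    their top principal components. A generic PCA sketch, not IDH."""
    mean = data.mean(axis=0)
    centered = data - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    components = vt[:n_components]      # principal axes
    scores = centered @ components.T    # reduced representation
    return scores, components, mean

def pca_restore(scores, components, mean):
    """Approximate reconstruction from the reduced representation."""
    return scores @ components + mean

# 10,000 samples in 512 dimensions reduced to 20 dimensions
data = np.random.rand(10_000, 512)
scores, comps, mean = pca_reduce(data, n_components=20)
approx = pca_restore(scores, comps, mean)
```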


IDH is a disruptive innovation that will create a new kind of storage market and value network, potentially displacing current data compression technologies over the next decade. It will help realize many applications that rely on real-time Big Data transmission, such as business analytics, streaming 3D video, remote cybernetic control and robotics. IDH will impact technology and business in ways the market has not foreseen, opening up new and diverse applications to new consumers.


The impact that this could have in today's increasingly digital world is truly massive.


IDH could be integrated into massively parallel processing chipsets or built into PCs and smartphones, making it ubiquitous for any application that handles big data volumes, from streaming media to business analytics to autonomous vehicles.


A clear benefit is in the video industry. Netflix, for example, deals with hundreds of terabytes a month in streaming movies and TV programs. IDH would enable Netflix to perform batch transfers at average broadband upload speeds of 2.5 Mbps: 500 TB per month would take an acceptable 8 hours per day. With higher upload speeds like BT Infinity's 34 Mbps, this falls to just over half an hour per day, and downloads at today's 100 Mbps would be faster still. The pixelated, very high-dimensional nature of images and video also makes them well suited to IDH.
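
For a rough check of this arithmetic, the hours of transfer needed per day can be computed from the monthly volume, the link speed and an assumed reduction factor. The ~1,850x factor below is simply back-solved from the figures quoted above; it is not a published IDH specification.

```python
def hours_per_day(monthly_bytes, reduction_factor, link_mbps, days=30):
    """Hours of transfer per day to move one month's data after
    reduction, over a link of the given speed in megabits per second."""
    bits_per_day = monthly_bytes * 8 / reduction_factor / days
    return bits_per_day / (link_mbps * 1e6) / 3600

MONTHLY = 500e12    # 500 TB of source data per month
REDUCTION = 1850    # assumed reduction factor, implied by the article's figures

print(hours_per_day(MONTHLY, REDUCTION, 2.5))   # ~8.0 hours/day at 2.5 Mbps
print(hours_per_day(MONTHLY, REDUCTION, 34))    # ~0.59 hours/day at 34 Mbps
```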


Telcos could offer smartphones with High Efficiency Video Coding (HEVC) compression technology to improve image quality for full high-definition (HD) playback and reduce the data strain on networks, making it a win-win for users and operators.


Compared with MPEG-4 AVC, HEVC-compressed video can be downloaded twice as fast, or higher-quality video can be streamed at the same speed. Personal computer and tablet manufacturers could provide playback with real-time decoding of 4K UHD TV, which offers four times the resolution of full HD. New technologies such as 3D are becoming increasingly important as differentiators, and IDH will provide a solution for premium broadcasters looking to offer services above and beyond HD.
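
As a back-of-envelope illustration of the halved bitrate (the specific bitrates here are assumed for the example, not taken from the article):

```python
# Assumed illustrative bitrates, not figures from the article
avc_1080p_mbps = 8.0                    # typical MPEG-4 AVC full-HD stream
hevc_1080p_mbps = avc_1080p_mbps / 2    # HEVC targets roughly half the bitrate

# Size of a two-hour movie in gigabytes under each codec
def movie_gb(mbps, hours=2):
    return mbps * hours * 3600 / 8 / 1000

print(movie_gb(avc_1080p_mbps), movie_gb(hevc_1080p_mbps))   # ~7.2 GB vs ~3.6 GB
```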


The Big Data industry is churning out some of the biggest innovations the world has ever seen, and Information Density Holography could be one of the most innovative and disruptive yet.
