Top 6 storage trends to watch out for

With technology evolving and new formats of data gaining importance, Richard Wilcox, Regional Director – Data Center Group (DCG), Lenovo Middle East, highlights the top 6 data storage trends to watch out for.

Richard Wilcox, Regional Director – Data Center Group (DCG), Lenovo Middle East

Two things are evident. Data has become increasingly valuable as a resource, and the amount of data being generated continues to grow exponentially. So, where will we store it all? Traditionally, most data could be stored in a database; today, however, new technologies are producing data formats not suited to that environment, including sensor data, video footage, and other types of unstructured data. This rapid development will have an enormous impact on the storage market and on individual companies. IT decision makers and managers must start planning for the future.

1. Standardising management
Every smart business should be looking to unlock the power of its data. But first, this data must be made more manageable. Here, conventional tools are inadequate. Powerful hardware and standardised data management are needed.

The aim should be to introduce as much standardisation as possible. For example, businesses should look to centralise the administration of existing storage systems, ideally through a single interface. Data will be more easily sorted, controlled, and utilised if it can be more universally understood.
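To make the idea concrete, here is a minimal Python sketch of what a unified management layer might look like. The backend classes and method names are illustrative assumptions, not a real vendor API; each wrapper would call its system's actual management interface.

    from abc import ABC, abstractmethod

    class StorageBackend(ABC):
        """One common interface wrapped around every storage system."""

        @abstractmethod
        def capacity_used(self) -> int:
            """Bytes currently in use on this system."""

        @abstractmethod
        def list_volumes(self) -> list[str]:
            """Names of all volumes/buckets on this system."""

    class SanArray(StorageBackend):
        def capacity_used(self) -> int:
            return 42 * 2**40  # placeholder: would query the array's API

        def list_volumes(self) -> list[str]:
            return ["vol-finance", "vol-erp"]

    class ObjectStore(StorageBackend):
        def capacity_used(self) -> int:
            return 7 * 2**40  # placeholder: would query the object store

        def list_volumes(self) -> list[str]:
            return ["bucket-video-archive"]

    # One loop, one interface, regardless of what sits underneath.
    for system in (SanArray(), ObjectStore()):
        print(type(system).__name__, system.capacity_used(), system.list_volumes())

The point of the abstraction is that reporting, sorting, and control logic is written once against the common interface rather than once per storage product.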

2. Hybrid storage and tiering systems
Many companies use a combination of local storage and cloud platforms. When fast access to large amounts of data is required, the local provision of SAN or other storage systems is still essential. Less frequently used data can be stored in the cloud for backup and archiving purposes. To optimise how storage is allocated, tiering mechanisms automatically decide where data is stored.
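A tiering decision can be as simple as a rule over access recency. The sketch below is a simplified illustration of the principle; the tier names and day thresholds are assumptions for the example, not a product feature.

    import time

    # Hypothetical tiers, ordered fast-and-expensive to slow-and-cheap.
    HOT, WARM, COLD = "local-ssd", "local-hdd", "cloud-archive"

    def choose_tier(last_access_epoch: float) -> str:
        """Place data by how recently it was read: recent data stays local,
        rarely used data moves out to cloud backup/archive storage."""
        idle_days = (time.time() - last_access_epoch) / 86400
        if idle_days < 7:
            return HOT   # frequently used: keep on fast local storage
        if idle_days < 90:
            return WARM  # occasionally used: cheaper local tier
        return COLD      # archival: push to the cloud

    # Example: a file last touched 200 days ago lands in the archive tier.
    print(choose_tier(time.time() - 200 * 86400))  # -> cloud-archive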

3. Artificial Intelligence powered by fast storage
Another trend that will profoundly influence storage solutions is artificial intelligence. Large amounts of data come into play especially during the machine and deep learning phase, when the AI system examines existing data for certain characteristics and is "trained" accordingly.

Wherever GPU-based computing systems are used, the rapid exchange of data between the AI system and the underlying storage plays a decisive role. Ultimately, the same lesson applies here: find the right mixture of local and cloud-based storage systems.
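In practice, keeping GPUs busy often means overlapping storage reads with computation so the accelerator never waits on I/O. Below is a minimal, framework-free sketch of that prefetching pattern; read_batch is a hypothetical stand-in for whatever loader actually pulls data from the SAN, object store, or cloud.

    import queue
    import threading

    def read_batch(i: int) -> bytes:
        # Stand-in for a real storage read (SAN volume, object store, etc.).
        return b"x" * 1024

    def prefetcher(n_batches: int, buf: queue.Queue) -> None:
        """Pull batches from storage ahead of time on a background thread."""
        for i in range(n_batches):
            buf.put(read_batch(i))  # blocks when the buffer is full
        buf.put(None)               # sentinel: no more data

    buf: queue.Queue = queue.Queue(maxsize=4)  # bounded prefetch buffer
    threading.Thread(target=prefetcher, args=(16, buf), daemon=True).start()

    while (batch := buf.get()) is not None:
        pass  # here the GPU would train on `batch` while the next one loads

The bounded buffer is the key design choice: it lets storage run ahead of the GPU without letting memory use grow without limit.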

4. Local data centres for faster connection
Cloud providers are increasingly recognising that they need to deliver the fastest possible connection to corporate infrastructure. So providers such as Microsoft and Amazon are building new data centres closer to the user's location, helping to eliminate, or at least temper, slow connections to the cloud server.

This also applies to smaller cloud providers, which are much more decentralised and regional than the likes of Azure or AWS. A strong internet connection is still required, but it can be achieved easily with the help of smaller, more local data centres. This type of regional provider represents a healthy compromise in terms of cost and performance, and can often act as a high-speed connection point to public clouds, enabling multi-cloud solutions.

5. Backup and recovery solutions must fit the requirements
With ever-growing data volumes, recovering petabytes of lost data is far more challenging than recovering gigabytes or terabytes. The same is true for archiving large amounts of data, even though archiving is less time-critical than recovery. Here, advancements such as intelligent indexing and the storage of metadata will play a crucial role, because unstructured data, such as video, must remain easy to locate.
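Intelligent indexing amounts to recording searchable metadata alongside the raw files, so that a query hits a small index rather than petabytes of archive. A minimal sketch using SQLite follows; the schema, fields, and paths are illustrative assumptions.

    import sqlite3

    con = sqlite3.connect(":memory:")  # a real index would live on durable storage
    con.execute("""CREATE TABLE assets (
        path TEXT PRIMARY KEY,  -- where the raw file actually lives
        kind TEXT,              -- e.g. 'video', 'sensor-log'
        recorded TEXT,          -- ISO date the data was captured
        tags TEXT               -- free-form, comma-separated labels
    )""")
    con.execute("INSERT INTO assets VALUES (?, ?, ?, ?)",
                ("s3://archive/cam1/2020-01-03.mp4", "video",
                 "2020-01-03", "lobby,camera1"))

    # Locate footage without scanning the raw video itself.
    rows = con.execute("SELECT path FROM assets WHERE kind = 'video' "
                       "AND tags LIKE '%camera1%'").fetchall()
    print(rows)  # -> [('s3://archive/cam1/2020-01-03.mp4',)]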

6. High-Performance Computing arrives in medium-sized businesses
Historically, HPC was almost exclusively the domain of universities and state-owned computing centres. Soon enough, even medium-sized businesses will require HPC solutions to process the amount of data they are likely to be generating.

As data volumes increase, HPC will be necessary wherever computing- and storage-intensive simulation applications are used. Take a large engineering office running highly complex calculations, for example: it requires local, high-performance computing units for the calculation and visualisation of 3D objects. Without an HPC environment, processing the amount of data involved would be either extremely time-consuming or simply impossible.

What’s next?
Storage is already undergoing significant change. New advancements include object storage for improved indexing and metadata allocation, and storage-class memory for faster, lower-latency access using smarter tiering mechanisms. Flash technology in the form of SSD components will also continue to evolve, eventually replacing the classic hard disk, and for performance the NVMe protocol will begin to roll out on a much larger scale.

It’s certainly an exciting time to be in storage, with the next generation of storage already coming into play, and more innovation on the horizon.
