Data is the new gold for the modern financial business, but the question of how to effectively mine, manage
and utilise that data has proved far more challenging. For the finance sector, the challenges are multiplied by
the sheer volume of data from billions of transactions. These billions of transactions hide the good, the bad
and the downright ugly. When you are faced with this unrelenting barrage of transactions, how do you know
which is which?
For many years, forensic analytics has let you look back on historical data to identify issues after the fact.
These reactive activities are key in modern finance; however, today's bankers are also looking for a proactive
strategy, applying continuous surveillance and screening of data on arrival. Machine learning is increasingly
used to drive these operations, delivering near real-time results, and can tie in neatly with compliance and
regulatory processes. The benefits go far beyond the realm of forensic accounting, offering business value
through predictive analytics and the detection of positive trends, relationships, and sentiment.
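The screening-on-arrival idea can be sketched in a few lines of code. The class name, threshold, and use of a simple statistical score are illustrative assumptions — a production system would use a trained machine-learning model and far richer features — but the pattern is the same: score each transaction as it arrives, against what has been seen so far.

```python
import statistics

class TransactionScreen:
    """Minimal sketch of continuous transaction screening (illustrative only).

    Flags an incoming amount as suspicious when it deviates from the
    running history by more than `threshold` standard deviations.
    """

    def __init__(self, threshold=3.0):
        self.history = []          # amounts seen so far
        self.threshold = threshold  # z-score cut-off (assumed value)

    def screen(self, amount):
        """Return True if the amount looks anomalous, then record it."""
        flagged = False
        if len(self.history) >= 2:
            mean = statistics.fmean(self.history)
            stdev = statistics.stdev(self.history)
            if stdev > 0 and abs(amount - mean) / stdev > self.threshold:
                flagged = True
        self.history.append(amount)
        return flagged
```

Fed a stream of typical amounts around 100, this sketch passes them through quietly and flags a sudden 10,000 — the proactive counterpart to the retrospective, forensic look-back described above.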
In order to deliver these real-time insights powered by machine learning, data needs to be stored in a way that
can be accessed quickly, while still adhering to a highly-regulated environment. This means that the systems
need to be both secure and able to offer ever-lower latencies. A few years ago, many milliseconds of latency
in response times were the norm. Now, high-performance analytics dictates that only latencies well under one
millisecond are acceptable, and, very soon, that will reduce to a few tens of microseconds. However, the one
aspect that does not change is the universal requirement for consistent, repeatable, and predictable data
management regardless of where your data resides.
The technology leading the next wave in the fight for even lower latencies and higher performance is NVMe.
To date, industries have reused the tried and tested storage protocols SAS and SATA. That’s fine, but
these protocols were designed for the mechanical HDD age and carry the baggage of managing HDDs.
While we were all taught in school that HDDs are ‘random access’ devices, an HDD is limited by a single set of
read/write heads that can only be in one place at one time. If you share HDDs between busy workloads,
achieving the nirvana of consistent low latency is difficult; hence the admin overheads associated with
short-stroking, load balancing and queuing.
Flash Storage, on the other hand, is like memory. It can be accessed in parallel, by many applications at the
same time, at extremely low latencies. This is where NVMe delivers. It is designed to drive parallel access to
solid state storage technologies, addressing next generation bandwidth, IOPs, and latency requirements.
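The contrast can be illustrated in software. With a single-headed HDD, concurrent requests queue behind one another; with flash, many requests can be in flight at once. The function names below are illustrative, and Python threads are only a rough stand-in for hardware queue depth, but the sketch shows the access pattern NVMe is built to serve: many independent readers hitting the same device in parallel.

```python
import os
from concurrent.futures import ThreadPoolExecutor

def read_chunk(path, offset, size):
    # Each worker opens its own handle, so reads are issued independently
    # rather than serialised behind a single file position.
    with open(path, "rb") as f:
        f.seek(offset)
        return f.read(size)

def parallel_read(path, chunk_size=4096, workers=8):
    """Read a file as many concurrent chunk requests (illustrative sketch)."""
    total = os.path.getsize(path)
    offsets = range(0, total, chunk_size)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map() preserves order, so the chunks reassemble correctly.
        chunks = pool.map(lambda off: read_chunk(path, off, chunk_size),
                          offsets)
    return b"".join(chunks)
```

On spinning media, those eight workers would still contend for one set of heads; on flash behind NVMe, with its thousands of deep hardware queues, they can genuinely proceed in parallel.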
The latest FAS hybrid storage systems from NetApp include NVMe. The FAS9000 supports up to
16TB of NVMe FlashCache, while the All Flash A-Series will easily add NVMe and NVMf technologies
in the near future.
There is no doubt in my mind that analytics and business intelligence will be top-of-mind projects for many
CIOs in 2017, as these topics underpin the value derived from digital transformation.
If you are looking to meet the challenges of digital transformation and analytics, then NetApp All-Flash
technologies are designed to deliver interoperability and future proofing with confidence.
Ask yourself the following questions:
If you are looking for more information, here are some useful links:
Hybrid FAS and E-Series Systems
All Flash Systems
Laurence James, on behalf of the EMEA Products and Solutions Team