Why build your own Big Data platform?

I spoke with several customers about their Big Data projects at CiscoLive 2017, and most of them seem to be facing the same problems. Big Data and analytics is a new workload and application for them, and the two main issues are:

- sizing is not as easy as sizing legacy workloads (databases, file sharing, ...)

- the implementation is long and complex

 

Outscale is introducing a new Big Data PaaS, and I met David Chassan from Outscale to discuss these challenges with him.

 

Christian Lorentz (CL): In a nutshell, what is the new Outscale platform about?

David Chassan (DC): Outscale built a Big Data PaaS solution based on MapR and NetApp storage, specially designed for VARs and optimized for the Cloud. It lets you launch a complete MapR platform, with an hourly-billed MapR license, in less than 5 minutes.

 

CL: What impact will it have on your customers?

DC: It will enable our partners to increase their revenue by winning more deals, managing their project portfolios more efficiently, and building a better-trained pool of Big Data experts. Partners can choose between three pre-packaged clusters to start with, and scale up and down as the customer requires.

 

CL: How will it strengthen Outscale's business?

DC: By simplifying the deployment of NetApp-based Big Data platforms in the Cloud and offering an alternative business model, it will attract new successful partners and reinforce our leadership. MapR as a Service means more agility for our partners and a new value proposition for the market.

 

CL: Are there any figures/proof points you’d be able to provide?

DC: First, whether a project needs a sandbox or a 15-node cluster with 6 TB of NetApp storage, it will be available in less than 5 minutes.

Second, two partners and five customers are already working with MapR as a Service.

That bodes well for the future!

 

Visit the Outscale website for more information.