EMC and Pivotal are thrilled to announce the EMC Data Computing Appliance (DCA) v3, the industry’s first and only data warehouse appliance built on open source-based data warehouse software, Pivotal Greenplum. DCA v3 is an integrated analytics platform that comes loaded with Pivotal Greenplum, Pivotal’s commercial release of the open source Greenplum Database® project, optimized to run on powerful EMC hardware. DCA v3 customers can now get the advanced functionality and analytics capabilities of the market’s leading open source data warehouse in an appliance form factor that delivers unrivaled time-to-insight.
While many companies have taken on big data projects with success and others have faced challenges, there are some key questions to nail down before a project turns from an idea into a proposal. In this post, data market strategist Jeff Kelly poses five of the most important questions you can ask yourself about leadership, skills gaps, change management, and more.
As 2016 kicks off, we are seeing the formation of a different perspective on data science. While some people grew weary of the over-used terms in the media, or suspiciously viewed them as a new, sexy way to say analytics, we think a corner has been turned. This post rounds up the top stories we’ve seen in the past month and lays out the importance of data science from the perspective of organizations, finance, personalization, The World Economic Forum, and more.
In this month’s BUILD Newsletter, we focus on new platforms that are quickly advancing in 2016—namely robotics, drones, voice recognition, NFC, wearables, smart TVs, 3D cameras, and much more. Of course, we will also update you on related Pivotal tech projects.
In 2016, with the digital transformation revolution sweeping its way into mainstream corporate agendas, there is increasing pressure to extract as much value from Big Data as possible … and fast! The good news is that the underlying technologies supporting Big Data analytics are evolving at a rapid pace, providing the tools you’ll need to turn data into insight, and insight into action. As a result, expect another year of Big Data firsts over the next twelve months, from mainstream adoption of real-time analytics to the rise of intelligent machines. In our annual post, here are Pivotal’s top data predictions for 2016.
Open source software, such as the Apache Hadoop® standard within the Big Data realm, has become the default and dominant choice when companies choose to deploy software. Not too long ago, most executives didn’t necessarily see how much of their operations run on open source. Now, they do. There are three key reasons why, and we outline them in this post along with an upcoming webinar on an Open Source Playbook for 2016.
After his first month at Pivotal, former Wikibon analyst Jeff Kelly starts his contributions to the Pivotal blog with a topic he holds in the highest regard: customer stories. Inspired to join Pivotal by our passionate customer successes, Jeff shares five stories from BMW, Purdue University, Time, WellCare and CoreLogic as key examples of the innovative ways Pivotal customers are putting Big Data to use and creating real value for themselves and their consumers.
As 2015 comes to an end, data science watchers and practitioners look to the year ahead and predict how the discipline will be transformed in 2016. But before we ring in the new year, here’s our roundup of the top data science news in December, both from Pivotal and beyond.
Former Wikibon analyst Jeff Kelly just joined the Pivotal Big Data team. Jeff has deep expertise in this market, having covered business intelligence, analytics, and big data for the past eight years. In this post, we sit down to talk about his background, what brought him to Pivotal, his role here, and why big data matters to companies around the world.
When building systems to handle real-time or streaming data, we need to look at some architectural elements differently. We cannot rely on the patterns of the past, but we can learn from the mistakes and successes of others! In this episode, we explore some of the key considerations when designing software to handle real-time data.