VoltDB CTO Ryan Betts and Flytxt CTO Prateek Kapadia discuss how the modern telecommunications data center must handle billions of high-frequency events daily. Technology and business teams face the challenge of architecting and managing analytics platforms that extract optimum value from these event streams. Combining fast data with advanced big data analytics – mining large volumes of historical data – is increasingly a key technology enabler for competitive differentiation and sustained economic value. Tapping into the value of data in real time – the moment it arrives – is a significant opportunity, but it requires the ability to track billions of events, generate real-time triggers that reflect contextual usage and deviations from defined behavior, and instantly take the right action at the right time through the right channel.
Writing applications on top of streaming data requires both scalable, high-throughput event processing and efficient large-volume storage. At scale, this requires combining best-in-class tools into a complete solution. Real-time applications – and the increasingly fast data streams created by personalized devices, IoT, and M2M – exceed the processing and storage capacity of legacy databases.
Big Data is transforming the way enterprises interact with information, but that’s only half the story. The real innovations are happening at the intersection of Fast Data and Big Data. Why? Because data is fast before it’s big. Fast Data is the torrent of events generated by mobile devices, sensor networks, social media, and connected devices – the Internet of Things (IoT).
Listen to CTO Ryan Betts as he explains how stream processing was not designed to serve the needs of modern Fast Data applications. Employing a solution that handles streaming data, maintains state, ensures durability, and supports transactions and real-time decisions is key to benefiting from fast data.
Listen to Walt Maguire, Chief Field Technologist of HP Vertica, and Ryan Betts, CTO of VoltDB, discuss the Fast + Big Data challenge and the new approach required for both corporate data architectures and the data management systems that enable them.
In this webcast, CMO Peter Vescuso and Dr. Michael Stonebraker discuss the new corporate data architecture and the technology components necessary for facing this data management challenge. Listen as they discuss the “one-size-never-fits-all” perspective for developing the ideal architecture for managing and maximizing the value of fast, big data in your organization.
In this on-demand session, Chris Wright, co-founder and CTO of deltaDNA, and Ryan Betts, CTO of VoltDB, discuss the transformative impact of micro-personalization on free-to-play (F2P) games, the large role in-memory analytics and high-performance data management play in building modern gaming platforms, and the power of deltaDNA’s VoltDB-enabled real-time Player Relationship Management platform for maximizing player engagement across F2P games, social casino, and real-money gambling.
In the third and final “Stonebraker Says” session, Dr. Stonebraker explained why optimizing fast data is the modern OLTP problem – and how an OLTP solution can help you extract the most value out of that data with extensions to main memory, support for streaming, and larger-than-memory applications.
The second in a series of informative, educational webcasts presented by Michael Stonebraker. In the second “Stonebraker Says” webcast, Dr. Stonebraker explained why you need to fix your database design – or risk being left behind in a world that’s smarter and faster than what your legacy system can handle.
In this session, Ari Gorman, CTO of Novatel Networks, and Ryan Betts, CTO of VoltDB, discuss today’s data challenges and how Novatel uses VoltDB to offer its customers superior value and maximum efficiency through expedited VoIP call routing.