VoltDB was engineered to handle Fast Data. We’re so excited by its capabilities that we worked with O’Reilly Media to write the eBook, Fast Data and the New Enterprise Architecture. Download the eBook today to learn more about Fast Data and the new enterprise data architecture: a unified data pipeline for working with Fast Data (data in motion) and historical Big Data together. Written by VoltDB Co-Founder and Chief Strategy Officer Scott Jarr.
In this analyst report, Ovum analyst Clare McCarthy discusses telcos’ pressing need to operate at Internet speed, and why they struggle to do so: they need to handle big data – and, increasingly, fast data – from a variety of sources and to take action in real time. VoltDB’s technology platform taps into live or in-session data flowing into the organization (such as clickstream and user-experience data) and analyzes it in real time, giving telcos a range of monetizable use cases that would not otherwise be possible.
Successfully writing applications that manage fast streams of data generated by mobile and smart devices, social interactions, and the Internet is development’s next big challenge. This VoltDB white paper explores three approaches: fast online transaction processing (OLTP) solutions; streaming solutions, such as complex event processing (CEP) systems and newer open-source tools; and fast online analytical processing (OLAP) solutions. It assesses the strengths and weaknesses of each contender’s underlying architecture and provides guidance for selecting an approach to the fast data challenge.
VoltDB Technical Overview is a white paper that covers VoltDB’s architectural concepts, identifies popular use cases, and explains how to get started with VoltDB.
The Gartner ODBMS Magic Quadrant (MQ) is considered the definitive source for competitive comparisons in the information technology industry. An MQ offers a visual summary and in-depth analysis of a market’s direction and maturity and of its key vendors.
The availability and abundance of fast, new data presents an enormous opportunity for businesses to extract intelligence, gain insight, and personalize interactions of all types. As applications with new analytics capabilities are created, what were two separate functions – the application and the analytics – are beginning to merge. CTOs and development managers now realize they need a unifying architecture to support the development of data-heavy applications that rely on a fast-data, big-data workflow. This white paper looks at the requirements of the fast data workflow and proposes solution patterns for the most common problems software development organizations must resolve to build applications capable of managing fast and big data.
In Q1 2014, VoltDB polled database managers, analysts, administrators, and other IT professionals about the databases they use, the results of their Big Data projects, and their opinions about Big Data technology advancements. The 2014 Big Data Survey reveals that most organizations cannot access, let alone utilize, the vast majority of the data they collect, and exposes a major Big Data divide: the ability to successfully capture and store huge amounts of data is not translating into improved bottom-line business benefits. What’s to blame? Deficiencies in database performance.
Software designers and architects build software stack reference architectures to solve common, recurring problems; the LAMP stack, which provides a framework for building web applications, is a great example. Big Data is data at rest; Fast Data is streaming data, data in motion. Across industries, a stack is emerging for building applications that process these high-velocity streams of data, which quickly accumulate into the ‘Big Data lake.’ This new stack, the Fast Data Stack, has a unique purpose: to grab real-time data and output recommendations, decisions, and analyses in milliseconds.
In this report, 451 Research analyst Jason Stamper discusses how VoltDB’s in-memory relational database is aimed at the analysis of what the firm calls fast data: enabling companies to make real-time decisions on data as it arrives.
Computer architectures are moving towards an era dominated by many-core machines with dozens or even hundreds of cores on a single chip. This unprecedented level of on-chip parallelism introduces a new dimension to scalability that current database management systems (DBMSs) were not designed for. In particular, as the number of cores increases, the problem of concurrency control becomes extremely challenging. With hundreds of threads running in parallel, the complexity of coordinating competing accesses to data will likely diminish the gains from increased core counts.
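The coordination cost described above can be illustrated with a toy sketch (not VoltDB or DBMS code, and the function names are invented for this example): threads updating shared state must serialize on a single lock, while giving each thread its own partition of the data removes the need for coordination entirely. Both variants produce the same total, but the contended version forces every update through one lock.

```python
import threading

def contended_increment(n_threads, increments):
    """Every thread serializes on one shared lock; coordination cost grows
    with the thread count, which is the scalability problem noted above."""
    lock = threading.Lock()
    total = [0]  # boxed so the worker closure can mutate it

    def worker():
        for _ in range(increments):
            with lock:
                total[0] += 1

    threads = [threading.Thread(target=worker) for _ in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return total[0]

def partitioned_increment(n_threads, increments):
    """Each thread owns a private counter (a partition), so no lock is
    needed; results are combined only at the end."""
    counters = [0] * n_threads

    def worker(i):
        for _ in range(increments):
            counters[i] += 1

    threads = [threading.Thread(target=worker, args=(i,))
               for i in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sum(counters)
```

Partitioning data so that each core (or thread) operates on its own slice without cross-thread locking is the same general idea behind partition-per-core database designs; this sketch only shows the shape of the trade-off, not any real engine's implementation.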