White Papers & eBooks

VoltDB was engineered to handle Fast Data. We're so excited by its capabilities that we worked with O'Reilly Media to write the eBook, Fast Data and the New Enterprise Architecture. Download the eBook today to learn more about Fast Data and the new enterprise data architecture: a unified data pipeline for working with Fast Data (data in motion) and historical Big Data together. Written by VoltDB Co-Founder and Chief Strategy Officer Scott Jarr.

VoltDB Technical Overview is a white paper that covers VoltDB's architectural concepts, identifies popular use cases, and explains how to get started with VoltDB.

Successfully writing applications to manage fast streams of data generated by mobile and smart devices, social interactions, and the Internet is development's next big challenge. This VoltDB white paper explores three approaches: fast online transaction processing (OLTP) solutions; streaming solutions, such as complex event processing (CEP) systems and newer open-source tools; and fast online analytical processing (OLAP) solutions. It assesses the strengths and weaknesses of each contender's underlying architecture and provides guidance for selecting an approach to the fast data challenge.

The availability and abundance of fast, new data presents an enormous opportunity for businesses to extract intelligence, gain insight, and personalize interactions of all types. As applications with new analytics capabilities are created, what were once two separate functions, the application and the analytics, are beginning to merge. CTOs and development managers now realize they need a unifying architecture to support the development of data-heavy applications that rely on a fast-data, big-data workflow. This white paper examines the requirements of the fast data workflow and proposes solution patterns for the most common problems software development organizations must solve to build applications and apps capable of managing fast and big data.

In Q1 2014, VoltDB polled database managers, analysts, administrators, and other IT professionals about the databases they use, the results of their Big Data projects, and their opinions on Big Data technology advancements. The 2014 Big Data Survey reveals that most organizations cannot access, let alone utilize, the vast majority of the data they collect, and exposes a major Big Data divide: the ability to capture and store huge amounts of data is not translating into improved bottom-line business benefits. What's to blame? Deficiencies in database performance.

Computer architectures are moving toward an era dominated by many-core machines with dozens or even hundreds of cores on a single chip. This unprecedented level of on-chip parallelism introduces a new dimension to scalability that current database management systems (DBMSs) were not designed for. In particular, as the number of cores increases, the problem of concurrency control becomes extremely challenging. With hundreds of threads running in parallel, the complexity of coordinating competing accesses to data will likely diminish the gains from increased core counts.