Keynotes

Towards Hardware-Software Co-Design for Data Processing: A Plea and a Proposal
Jignesh Patel (Pivotal and University of Wisconsin)

Bio: Jignesh Patel is the Chief Scientist at Pivotal, and a Professor in Computer Sciences at the University of Wisconsin-Madison. He is the recipient of the Wisconsin “COW” Teaching Award, and the U. Michigan College of Engineering Education Excellence Award. He also has a strong interest in seeing research ideas transition to actual products. His Ph.D. thesis work was acquired by NCR/Teradata in 1997. In 2007 he founded Locomatix, which became part of Twitter in 2013, and seeded the technology that has now become Heron. Heron now powers all real-time services at Twitter. His last company, Quickstep Tech., which was a spinoff from the project by the same name at U. Wisconsin, was acquired by Pivotal in 2015. He is also the founder of the NEST entrepreneurship contest at U. Wisconsin. This contest encourages Wisconsin students to create their own startups, and has directly contributed to the creation of a number of startups in Madison. Jignesh was also recently named as one of the top technology entrepreneurs in Madison. He is an ACM Fellow, serves on the board of Lands’ End, and also serves on the board of a number of startups. He blogs at http://bigfastdata.blogspot.com.

Abstract: For decades, we the database folks have been playing a catch-up game to the changes made by the hardware folks (a.k.a. the architects). In the mid 90s, we realized that the architects had added processor caches, so we started rewriting data processing kernels to make better use of caches. Then, we realized that there is this thing called the TLB, misses to which are outrageously expensive. So, we went and fixed our software to make better use of TLBs. In the earlier part of this century, we realized that processors had made tremendous advances in micro-architecture with features like out-of-order execution, so we started to think about how to react to that. We now find that in some cases we need to throw away changes that we made in reaction to some of the previous architectural events and go back to the drawing board. This game of waiting for architects to give us new architectural features and then reacting to them, repeated in an endless cycle, is wasteful. This waiting game is especially vexing for us since architecture is now at a crucial pivot point where radically different architectures are being proposed (e.g., integrated GPUs, HMC, Smart SSDs). Can we find a more synergistic and collaborative way of co-charting the future? To help out, from the database side, I have two very simple data processing kernels that I feel are likely to power a large fraction of analytical data processing workloads. Can we use these as a starting point to get the architects to help us figure out where various future architectural design paths are likely to lead, and thus arrive at a more synergistic hardware-software co-design strategy for data?
 


Near Memory Databases and Other Lessons
Ryan Betts (VoltDB)

Bio: Ryan Betts is CTO at VoltDB. He was one of the initial developers of VoltDB’s commercial product. Prior to joining VoltDB in 2008, Ryan was a software engineer at IBM. During a four-and-a-half year tenure, he was responsible for implementing device configuration and monitoring as well as Web service management. Before IBM, Ryan was a software engineer at Lucent Technologies for five years. In that role, he played an integral part in the implementation of an automation framework for acceptance testing of Frame Relay, ATM and IP services, as well as a high-availability upgrade capability and several internal components related to device provisioning. Ryan is an alumnus of Worcester Polytechnic Institute.

Abstract: This talk shares lessons and observations gathered over seven years of building and selling a new database management system. Designing, building, selling and supporting a new database in the world of cloud, NoSQL, and big data is equal parts exhilaration, success, and bewilderment. This talk is a retrospective on lessons learned from spending much of the last decade turning a Stonebraker paper into code, code into product, and product into a business.