Hyper’s Origins: A Q&A with Tobias Muehlbauer and Allan Folting

We chatted with Tobias Muehlbauer and Allan Folting about the origins of the Hyper Data Engine.

Hyper, our patent-pending data engine, is the culmination of a decade of academic research that began in the halls of Technical University Munich. Through a proud acquisition, the technology made its way to Tableau in March 2016 and has now arrived at your fingertips. We spoke with Tobias Muehlbauer (Development Manager at Tableau and co-founder of Hyper) and Allan Folting (Senior Manager, Engineering) about the origins of the technology, the rise of our Research and Design departments, and the community’s impact on this groundbreaking technology.

1. Given that Hyper stemmed from academia, what is your academic background?

Tobias Muehlbauer (TM): I started with my bachelor of science at Technical University Munich, then went on to do a master’s in software engineering here. I then visited the US, spending time at Stanford and the University of Illinois Urbana-Champaign as an external researcher. After that, I continued with my PhD in Munich on the Hyper project.

2. What was the original intention of Hyper and the technology?

TM: Hyper started as an academic project around 10 years ago at Technical University Munich, an institution comparable to Stanford. When we started, there was a lot of movement in the database market. Traditional technologies no longer satisfied the performance and functional requirements of modern applications. A lot of specialized systems were emerging: Hadoop, NoSQL systems, and dedicated engines for transactions and for analytics.

With Hyper, we wanted to create something else. We wanted to build a relational system from the ground up, questioning the traditional design decisions and optimizing for modern hardware. Among other things, we optimized for in-memory processing and designed for modern CPUs, which have many cores but are also more complex.

We also focused on combining transactional systems and analytics, bringing these specialized systems together in one system to unify transactions, data ingestion, and analytics.

Why did we do that? The answer is simple. If you have specialized systems, your data is in different places, and your analytics system might have a stale view of your data. With Hyper, you can really do analytics on the freshest data set. That was our main intention in the beginning, and it took us years to build up to the performance and functionality we wanted to see and have today.

3. What is so important for people to know about Hyper’s origins?

Allan Folting (AF): One of the most important activities was starting Hyper’s alpha programs very early. The whole reason we looked at replacing our existing data engine was the challenges customers ran into: not being able to create extracts in the time they had (for example, overnight, to prepare for analysis the next morning), or not wanting to wait five days for a large extract to be created.

We reached out to customers early on with builds of this system and looked at where we were with extract creation rates, and where we were with query performance once extracts were created. Along the way, we kept a pulse on a growing set of select customers, and I can’t wait to see them cut down that window of time for large extracts that previously took way too long or failed outright.

I'm grateful we had a large set of customers willing to experiment with the product even though it wasn't quite stable during the very early alpha stages, as you can imagine.

4. Can you talk about the first time you encountered Tableau?

TM: I first made contact with Tableau at an academic conference in Melbourne, where I was introduced to Patrice Pelland, then director of the database group at Tableau. I didn't know much about Tableau. For me, it was more of a visualization company, so I was not aware of how much research and database technology was in Tableau.

We had great conversations with Tableau employees during the conference, which led to interest in each other's products and a mutual feeling of unique energy. If you bring people together who are like-minded and passionate about their mission, you come up with great ideas. That happened in Melbourne, and from there began steady, back-and-forth conversations. Later that year, Christian Chabot came to visit Munich.

5. Allan, what was your first interaction with Tobi and Hyper? What were your first impressions when experiencing the power of Hyper?

AF: In early 2016, I learned about Hyper and the acquisition plans. Tobi and I met in March 2016 when he and the Hyper team came to visit Seattle. I had very high hopes, and I was not disappointed.

They came to work with us and dive deeper into the details of Hyper as we planned how to integrate the product into Tableau, but we also thought longer term about what we might do in subsequent releases. Some of the first things we did were fascinating, in-depth technical talks, which made us even more excited about the project. But we also worked on figuring out how to build and expand the research and development office in Munich, which I think is important to highlight.

To remain innovative going forward, I feel it’s critical we maintain a strong relationship with academia and the university in Munich, including the professors and students. We greatly treasure the relationship and value their partnership.

6. Can you tell us about the Research and Design (R&D) unit in Munich – and at Tableau at large?

AF: The first thing I want to highlight is some of the tenets Tobi outlined before: Hyper bringing specialized systems together into one, and working on the same state of the data. We're very passionate about maintaining these two areas.

It's very easy when you integrate a technology like this into Tableau or another product to lose sight of the things that made it special. Consequently, one of the things we place value on, working together with everybody here, researchers and full-time employees alike, is making sure that whenever we make decisions about Tableau functionality, we stay true to the tenets. Sometimes that means we must do something a little more innovative, a little more in-depth, so that a design decision made for one kind of workload doesn't hurt transactional or analytical workloads. That is one key area we're proud of, and we're benefiting from the composition of the team and the university in Munich.

TM: We're proud to have built a great team from the ground up in Munich. When we started, we had three full-timers and three part-timers, and that was only one-and-a-half years ago. Together, we now have 20 people in Seattle and 20 in Munich working on Hyper.

7. What does collaboration look like on a daily basis?

AF: To be honest, coming into it, we had concerns, just given the nine-hour time difference between Seattle and Munich. We go a little bit out of our way to communicate.

But I'm happy to report this is working very well. We visit each other quite frequently because it helps to have close-knit, interactive meetings. And with our video conferencing setups, and flexibility on both ends, it's working well.

We’re also trying to develop that further because we have other locations around the world, and we generally want to get better at collaboration, whether with individuals who are remote or with whole teams like this one.

TM: I agree. I think the most important part is that people get to know each other in-person. There is nothing more important than regular interactions—at least a few times a year we need to have dinner together. That helps.

8. Was there any point during the building process when you experienced challenges and felt like it might not come together, or did things line up perfectly for Hyper?

TM: Academia is always open-ended, and we tried a lot of things to find the perfect solution. What you see in Hyper today is the result of years of trial and error. Not everything's perfect, but we dug into the challenges. And the fun part was realizing we had solved a large portion of the technical issues, so we chose to bring the technology to market. But that presented its own set of challenges.

We made progress incrementally, Hyper got better, and more people grew interested in it. We received lots of positive feedback, and people started asking us, “Hey, we want to use it. Can we try it in production?” As an academic, you're of course honored. But then you start thinking, “What now?” And the obvious choice was to form a company and productize it.

9. How did the community help fuel the passion of this project within internal teams?

AF: That is a great point. Every time we shared community feedback with the team, it was like everybody grew five inches, they were so proud. Every day we deal with all the problems and we're trying to get this done; there are bugs and some things still aren't complete. So we're focused a lot on what you might consider the negatives.

I can't even explain how important it is to get that boost from positive feedback. A project manager on my team sent out an email with a customer’s findings, which were very positive. Everything was faster and they were super happy. People came by my office that day with big smiles on their faces and a whole new level of commitment. It's a long stretch to work on a project like this, so it's a takeaway for me to collect and share more feedback in the future. I didn't anticipate it would be so powerful, but it's unbelievable.

10. How was the community involved in the development of Hyper?

AF: We’ve heard and seen a lot of excitement and hype around these new capabilities for creating and querying extracts faster. With that, expectations for this first version are perhaps even higher, but we spent considerable time integrating the technology.

Several people have even stopped us at conferences, and I received hugs during the recent Tableau Conference (TC) because we're delivering on some long-standing customer wishes. There’s definitely a lot of positive feedback and very high expectations.

TM: And a big thanks to all our customers. We got great feedback at both TCs and through the alpha and beta program.

11. What are you anticipating for the Hyper launch?

TM: Our vision and mission to develop Hyper as the fastest-possible, general-purpose data engine hasn't changed. We're still on that mission, and that’s our foundation for things to come.

AF: This first version of Tableau with Hyper offers the benefits of faster extract creation, fast query performance on extracts, and in many cases, better scalability and stability in several environments. We're super proud of that, and it's been a long journey to replace the existing Tableau data engine with Hyper because of all the code that hooks into it. But we're over that hump.

There are several things Hyper brings to the table that we're not yet taking advantage of. I’m looking forward, for example, to doing more with even lower latency between when data arrives in the system and when it becomes available in Tableau visualizations. We're also looking at building deeper analytical capabilities into the system so analysts can do more advanced examination, whether it smells a bit like machine learning or data mining, or whether it’s statistical functions, at high speed and physically closer to the data. These are areas where we’ve only scratched the surface in this first release.

I'm very proud of the team for pulling this replacement product off in about 18 months. It was a major endeavor.

12. What is next for the R&D units?

AF: We want to keep investing. This project has proven to be extremely fruitful. I'm proud of the team, the level of innovation, the level of their delivery, and the collaboration with Technical University Munich: the advisors’ input and feedback are priceless.

For next steps with Hyper, we have two stages. We have what I call a shorter-term plan where we've allocated quite a bit of time to react to customer feedback. Since it's the first time this ships as a commercial product, we want to be ready and able to deal with feedback.

Then, we have some features we're delivering next quarter, but they're not major features. Again, we want to account for customer feedback, and we’re finishing additional performance improvements that weren’t ready for the first release.

13. Do you have any advice for academic entrepreneurs on continuing to move forward and not losing hope in whatever technology they invent over the next decade?

TM: My advice is “believe in yourself.” Sometimes a project needs extra work on the side; there are things that aren’t interesting from a research perspective. We had to build a database system from the ground up, and there are certain components you can’t publish in a paper. But if you put in that extra work and believe in what you’re creating, people will be interested and the industry will pick it up. And there is a great chance in just trying; you can’t lose.