"Confluent Coffee Break" (Video & Transcript)
Whitney Steward is a Senior Customer Success Technical Architect at Confluent, where she’s worked for four years. Prior to Confluent, she held roles in customer success and engineering at HashiCorp and New Relic.
Confluent is creating the foundational platform for data-in-motion. With Confluent, organizations can harness the full power of continuously flowing data to innovate and win in the modern digital world. Thank you to Confluent for partnering with WEST to present the Engineering Your Future Summit.
Transcript has been edited for clarity.
Whitney Steward: Hi, my name's Whitney Steward. I am just south of Boston in Pawtucket, Rhode Island. I've been at Confluent for about four years now. I work as a senior customer success technical architect. We've also coined this internally as a "siesta" because the full title is a lot to say.
Today I have a short presentation about what Kafka is, and then I'll talk about some of the fun stuff we have planned here at Confluent. And so let me go ahead and share my screen. Okay, awesome.
All right, so I already mentioned that my name is Whitney. I like to garden. That's a fun fact about me. I ruined cucumbers this year, and I'm pretty sad about it, but I think next year I'll know how to ace it.
So like I said, I'll go over some components that make up Kafka, and from there we'll jump into the fun stuff. So what is Kafka? If you're new to Kafka, you may think of the writer. And there's a fun fact about that: our CEO Jay was a big lit major, and he decided to name it Kafka because it's a system optimized for writing. And essentially, Kafka is a distributed event streaming platform.
And so here's a nice little image that can help you understand what that means. We've been faced with a paradigm shift in how we process data. We are moving from slower batch-focused processing to stream processing.
And I've explained this to my grandpa, who loves to tell me horror stories of, "Hey, back in my day, this is how we did things." But rather than waiting minutes or hours or even days to consume data in batches, we can instead react to smaller amounts of data and produce actionable results quickly, essentially in real time.
And moving to stream processing can help increase the accuracy of your data and results, make your systems more reactive, and increase the resiliency of your overall systems.
And Kafka does this for you, serving as a messaging system, a persistent storage layer, and processing layer. And we won't go too deep into that today because I have 15 minutes.
So again, we need to really shift our mindset on how to use Kafka effectively. And to do so, you need to start thinking in events. As humans, programmers, and users, we can already kind of think and process in events. When we submit web forms to update information, when we look at logs, when we react to notifications, these are all essentially events: pieces of information that describe something that has happened.
But we also need to know when that thing occurred, who and what was involved. And then another key component of an event is that events are meant to be immutable, and this is due to the fact that they describe things that have already happened.
So being immutable is an unfortunate fact of life. We can't go back in time. We don't have a time machine. That's just the way of life.
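To make that concrete, here is a minimal sketch (the `Event` class and its fields are hypothetical, not Kafka's API) of modeling an event as an immutable record that captures what happened, when, and who was involved:

```python
from dataclasses import dataclass, FrozenInstanceError

# An event records what happened, when it occurred, and who was involved.
# frozen=True makes instances immutable: they describe things that have
# already happened, so they can never be changed after the fact.
@dataclass(frozen=True)
class Event:
    what: str    # what happened, e.g. "form_submitted"
    when: float  # timestamp, seconds since epoch
    who: str     # the actor involved

e = Event(what="form_submitted", when=1700000000.0, who="user_42")

try:
    e.who = "someone_else"  # any attempt to mutate the event fails
except FrozenInstanceError:
    print("events are immutable")
```

Reading an event is always safe; rewriting history is simply not allowed, which mirrors the point above.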
Based on what we just learned, you can probably see how Kafka facilitates the movement of events across a system for real-time processing. If we think about it, Kafka allows us to communicate that immutable events are occurring throughout your system in real time, and then gives you the power to process and react to those events in real time. And that's pretty wild. And then we'll talk a little bit about how it does that and look deeper into Kafka architecture.
So this is just an example of producing to Kafka. So here you can see that the messages are being added to the end of the log. This is what we mean by being immutable. And I have another representation of this in a later slide.
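That append-to-the-end-of-the-log behavior can be sketched in a few lines of plain Python (this is an illustration of the storage model, not Kafka's real implementation or API):

```python
# A topic partition behaves like an append-only log: producing appends a
# record to the end and returns its offset; existing records are never
# modified or removed by a read.
class PartitionLog:
    def __init__(self):
        self._records = []

    def append(self, key, value):
        offset = len(self._records)  # offsets are assigned sequentially
        self._records.append((key, value))
        return offset

    def read(self, offset):
        return self._records[offset]  # reading leaves the log untouched

log = PartitionLog()
assert log.append("user_42", "clicked") == 0
assert log.append("user_7", "scrolled") == 1
assert log.read(0) == ("user_42", "clicked")
```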
So the primary unit of storage in Kafka is a topic. And topics typically represent an individual data set consisting of, you guessed it, events. We produce data as key-value pairs in Kafka using separate clients called Kafka producers. We can have as many producers as we want writing to as many topics as we want.
When I talk to customers, we have strategies around this, depending on their business use cases, but you can scale up how many you want, dependent on throughput and other requirements that you might have.
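One useful detail about those key-value pairs: records with the same key always land on the same partition, which preserves per-key ordering. A rough sketch of that routing follows (Kafka's default partitioner actually uses a murmur2 hash; `md5` stands in here purely for a deterministic illustration):

```python
import hashlib

# Map a record key to a partition. Same key -> same partition, always,
# so all events for one key stay in order on one partition.
def partition_for(key: str, num_partitions: int) -> int:
    digest = hashlib.md5(key.encode()).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

p1 = partition_for("user_42", 3)
p2 = partition_for("user_42", 3)
assert p1 == p2  # repeated sends for the same key are co-located
```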
Okay, so on the next slide: the nodes of a Kafka cluster are called brokers. And these brokers can be things like bare metal or VMs or containers, et cetera. In a simple cluster where we have a handful of topics, each with three partitions, the partitions might be distributed like this, and in the event a broker has a problem and goes down, at least you wouldn't lose all partitions in your topic. But in Kafka, we can do better than that. And that's where replication comes in.
So, replication is something you can configure, and the replication factor determines how many copies of a given partition will exist across the Kafka cluster. A replication factor of three is pretty standard. In that case, even if a broker goes down, you still have two copies of your data somewhere else in the cluster. And that's pretty cool.
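The placement idea can be sketched like this (a simplified round-robin assignment for illustration; Kafka's real replica assignment is more involved and rack-aware):

```python
# Assign `replication_factor` copies of each partition to distinct brokers,
# so losing any single broker still leaves copies elsewhere in the cluster.
def assign_replicas(num_partitions, brokers, replication_factor):
    assignment = {}
    for p in range(num_partitions):
        assignment[p] = [
            brokers[(p + r) % len(brokers)]
            for r in range(replication_factor)
        ]
    return assignment

brokers = ["broker-1", "broker-2", "broker-3", "broker-4"]
plan = assign_replicas(3, brokers, replication_factor=3)
# every partition's three replicas sit on three distinct brokers
assert all(len(set(replicas)) == 3 for replicas in plan.values())
```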
So within Confluent, we have Confluent Platform and Confluent Cloud, depending on what the use case is for a given customer. What's fortunate about using Confluent is we have these things called connectors. With connectors, you can basically pull data in from external services or push it out. And behind the scenes, if you did a Scooby-Doo reveal, it's essentially the producer API and consumer API: a source connector is a producer, and a sink connector is a consumer. And these allow us to really interact with different systems that you're already working with. And you can even create your own connector if you want to. If you use Confluent Cloud, we have bring-your-own connectors, which is really fun. I've set 'em up myself.
Yeah, so Kafka Connect, like I mentioned, is a tool that bridges the gap between Kafka and external systems. At the ingest stage, it allows you to point to a data source, collect that data, and write it into Kafka. It's simple, requires no code, and you can tune it to your needs by changing different configuration settings.
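As a rough illustration of that "no code, just configuration" point, a self-managed JDBC source connector is typically submitted to Kafka Connect as a small JSON config along these lines (the database URL, column name, and topic prefix below are made-up placeholders):

```json
{
  "name": "orders-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://db.example.com:5432/shop",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "topic.prefix": "db-",
    "tasks.max": "1"
  }
}
```

Changing the source, the topics it writes to, or the parallelism is just a matter of editing these settings.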
And in Confluent Cloud alone, there are a dozen fully managed connectors available. And like I mentioned, you can also customize your own. And just a little pivot here.
We currently have a lot of events coming up at Confluent. I was fortunate that I got to be mentored by a woman as soon as I started. Currently, we have a more formalized program for that mentorship.
For AAPI month, we have a number of different events that a lot of my friends have put together, including things like Bollywood workout classes and fortune cookie-making classes. Last year we had a super fun class where we got to make bubble tea and go into making Boba from scratch, which was a really cool experience.
And we also get to use different tools internally, like Glean AI and Zoom AI tools. This is really helpful if we need to formalize some of our notes for customers or if we need a specific resource. It helps scrape things from different resources, whether that's Slack or Zendesk, and gives us a really neat tool that we can use proactively.
And then in terms of these pictures that I put here, this is our growth kickoff that happens annually. Usually it's the end of January, early February. The last couple ones have been in Vegas, and the next one is currently going to be in Seattle in February.
So if you have any questions, feel free to reach out. My name is Whitney Steward. I'm the only Whitney at Confluent. Send me an invite on LinkedIn. I'm happy to help in any way or capacity that I can. And that's all I have today. Thank you for letting me talk today. Been a great opportunity.
Karen Ko: Thanks, Whitney. Maybe we can turn the slides off and see if folks have any questions regarding job opportunities or what the culture has been like at Confluent and working there. I'm seeing some thank yous in the chat. What's your favorite part of working at Confluent?
Whitney Steward: The community. I have a lot of friends at Confluent, and it's always nice to bounce ideas off of each other. I think we call ourselves the Siesta org. None of us are taking naps, but we just really like bouncing ideas off, and if somebody needs help, everyone's quick to help that person. And I think that's a really good company culture to have.
Karen Ko: Amazing. And it sounds like there are some ERGs that are at Confluent.
Whitney Steward: Yes, definitely.
Karen Ko: Amazing. And Michelle says she loves your glasses and pink walls [cross-talk]. Amazing. And then if folks are interested in applying for roles at Confluent, what's the best way to go about that? Should they just apply online, or can they reach out to you about learning more about the company? What do you think?
Whitney Steward: I think as far as what you're comfortable with, what your comfort level is, if you feel comfortable applying, apply. If you have any questions, reach out to me, and I'm happy to talk to you about the company culture or answer any questions you may have.
Karen Ko: Amazing. Well, thank you so much for sharing more about Confluent and Kafka with us. This was so fun to learn a little bit more about the inner workings of Confluent, and we are so thankful for that partnership with Confluent to be able to help us put this conference together. So thank you, Whitney, and thank you, everyone, for joining us in this call, and we'll see you in the next session. Thank you. Bye. Sarah says she loves working with Kafka in her current role. Excellent. Okay, we'll see you in the next one.