Kafka Streams in Linux: How They Work & Why They Matter

Kafka streams strike many students as an intimidating subject, but they don't need to be. Just think of a stream as a sequence of events. In fact, while I was gathering material for this blog post, I joked that taking in all of this data felt like drinking from a waterfall. Chad (the Training Architect who created our new Kafka course) ran with that, and we ended up with an analogy that helps explain it:


Understanding Kafka Streams

Picture a Kafka cluster, and start with the data as a river and a waterfall. The water comes down and falls into a lake (Kafka itself), and now you can do different things with it. Dig irrigation ditches to send some of the data off in different directions. Treat the water in some streams and not in others, depending on its end use: make it drinkable in one (for drinking), filter particles out of it in another (for swimming), or leave it alone if people just want it for watering their gardens.

Kafka's streams and topics work like those irrigation ditches and water-treatment processes. Does that make sense?
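To make the lake-and-ditches picture concrete, here is a minimal toy model of a topic in plain Python. This is an illustration of the idea only, not the real Kafka API: a topic is an append-only log, and every consumer reads from it at its own pace without removing anything. All class and method names here are invented for the sketch.

```python
from collections import defaultdict

class Topic:
    """Toy model of a Kafka topic: an append-only log with per-consumer offsets."""

    def __init__(self, name):
        self.name = name
        self.log = []                    # the "lake": events stay after being read
        self.offsets = defaultdict(int)  # each consumer tracks its own position

    def produce(self, event):
        """A producer drops an event into the lake."""
        self.log.append(event)

    def consume(self, consumer_id):
        """Return this consumer's unread events and advance its offset."""
        offset = self.offsets[consumer_id]
        events = self.log[offset:]
        self.offsets[consumer_id] = len(self.log)
        return events

clicks = Topic("remote-clicks")
clicks.produce("play")
clicks.produce("pause")

print(clicks.consume("recommender"))  # ['play', 'pause']
print(clicks.consume("recommender"))  # [] -- already caught up
print(clicks.consume("billing"))      # ['play', 'pause'] -- independent reader
```

The key point the sketch shows: reading the stream doesn't drain the lake. Two consumers ("irrigation ditches") can each take the same water and do different things with it.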

In the real world, Kafka streams can carry:

  • Credit card transactions
  • Stock market trades
  • Parcel delivery updates
  • Network events

There are plenty more examples, because Kafka streams can be almost anything. They don't rely on any external framework, which means developers can produce, process, and consume events in their own applications without worrying about a rigid set of rules for accessing them. Let's dig deeper into an example you're probably familiar with in practice…
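The "treat the water differently depending on its end use" idea maps directly onto stream processing. Here is a hedged sketch in plain Python (not the Kafka Streams API; the field names and the fraud threshold are made up) of one source stream of credit-card transaction events being split into two derived streams:

```python
# One source stream of transaction events...
transactions = [
    {"card": "1234", "amount": 25.00},
    {"card": "5678", "amount": 9800.00},
    {"card": "1234", "amount": 12.50},
]

# Ditch 1: filter out large transactions for fraud review.
fraud_review = [t for t in transactions if t["amount"] > 5000]

# Ditch 2: aggregate running spend per card.
spend = {}
for t in transactions:
    spend[t["card"]] = spend.get(t["card"], 0) + t["amount"]

print(fraud_review)  # [{'card': '5678', 'amount': 9800.0}]
print(spend)         # {'1234': 37.5, '5678': 9800.0}
```

Same source data, two different treatments, two different consumers downstream. That's the whole analogy in a dozen lines.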

Real-world example

Guess what? Netflix uses Kafka. It receives raw data (the river and waterfall) from you (what you watched) and sends it into Kafka. That data is then queried or processed (treated, like the water in our little analogy), and the result is what you see as your "watch next" suggestions. Every time you press a button on your Roku remote, that's a piece of data (an event, in Kafka terminology) dropping into the river.
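For a sense of what one of those button presses might look like as data: Kafka records are essentially a key, a value, and a timestamp. The field names below are invented for illustration (this is not Netflix's actual schema), but the shape is representative:

```python
import json

# Hypothetical viewing event -- roughly the key/value/timestamp
# shape of a Kafka record. Field names are made up for illustration.
event = {
    "key": "user-42",  # which viewer; keying by user keeps their events together
    "value": {
        "action": "play",
        "title": "The Walking Dead",
        "device": "roku-living-room",
    },
    "timestamp": 1700000000,
}

# Events are typically serialized (e.g. as JSON) before being produced.
serialized = json.dumps(event)
print(json.loads(serialized)["value"]["action"])  # play
```

Keying the record by user is a common design choice: Kafka guarantees ordering within a partition, and records with the same key land in the same partition, so one viewer's events stay in order.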

A quick aside: I do have a genuine Kafka-streams question whenever I use Netflix. For some reason, the app in my bedroom never remembers the last episode of The Walking Dead I watched in the living room.

Want to learn Kafka Streams?

Kafka seems to be well on its way to replacing databases in many big-data situations. If you're currently involved in your organization's backend infrastructure, you should check out Apache Kafka Deep Dive. And if you're thinking about getting into big data, take a look at Chad's course.
