Console Producer and Consumer Basics using Apache Kafka

Question:

   What is the simplest way to write messages to and read messages from Kafka?

  
So you're excited to get started with Kafka, and you'd like to produce and consume some basic messages quickly. In this tutorial, we'll show you how to produce and consume messages from the command line, without writing any code.

  
Install Docker Desktop (version 4.0.0 or later) or Docker Engine (version 19.03.0 or later) if you don't already have it

  
Install the Docker Compose plugin if you don't already have it. This isn't necessary if you have Docker Desktop, since it includes Docker Compose.

  
Start Docker if it's not already running, either by starting Docker Desktop or, if you manage Docker Engine with systemd, via systemctl

  
Verify that Docker is set up properly by ensuring no errors are output when you run docker info and docker compose version on the command line, as shown below
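Both checks are quick to run back to back; if either command prints an error, fix your Docker installation before continuing:

docker info
docker compose version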

  
Next, create the following docker-compose.yml file to obtain Confluent Platform (for Kafka in the cloud, see Confluent Cloud).

  

---
version: '2'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.3.0
    hostname: zookeeper
    container_name: zookeeper
    ports:
      - "2181:2181"
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000

  broker:
    image: confluentinc/cp-kafka:7.3.0
    hostname: broker
    container_name: broker
    depends_on:
      - zookeeper
    ports:
      - "29092:29092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: 'zookeeper:2181'
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://broker:9092,PLAINTEXT_HOST://localhost:29092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS: 0
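With the file saved, launch the stack. Assuming docker-compose.yml is in your current directory, this standard Docker Compose command starts both containers in the background (give the broker a few seconds to finish starting up):

docker compose up -d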

  

 

  
Your first step is to create a topic to produce to and consume from. Use the following command to create the topic:

  

docker exec -t broker kafka-topics --create --topic orders --bootstrap-server broker:9092
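Optionally, you can verify that the topic exists before going any further. This describe call isn't part of the original steps, but it uses the same kafka-topics tool already present in the broker container:

docker exec -t broker kafka-topics --describe --topic orders --bootstrap-server broker:9092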

 

  

 

  
Next, let's open up a console consumer to read records sent to the topic you created in the previous step.

  
From the same terminal you used to create the topic above, run the following command to open a terminal on the broker container:

  

docker exec -it broker bash

 

  From within the terminal on the broker container, run this command to start a console consumer:

  

kafka-console-consumer \
  --topic orders \
  --bootstrap-server broker:9092

  

 

The consumer will start up and block, waiting for records; you won't see any output until after the next step.

  
To produce your first record into Kafka, open another terminal window and run the following command to open a second shell on the broker container:

  

docker exec -it broker bash

 

  From inside the second terminal on the broker container, run the following command to start a console producer:

  

kafka-console-producer \
  --topic orders \
  --bootstrap-server broker:9092

  

 

The producer will start and wait for you to enter input. Each line represents one record; to send it, hit the Enter key. If you type multiple words and then hit Enter, the entire line is treated as one record.

  
Try typing one line at a time, hitting Enter and then checking the console consumer window for the output. Alternatively, you can copy all the records below and send them at once.

  

the
lazy
fox
jumped over the brown cow
how now
brown cow
all streams lead
to Kafka!

 

Once you've sent all the records, you should see the same output in your console consumer window. After you've confirmed receiving all the records, go ahead and close the consumer by entering CTRL+C.

  
In the first consumer example, you observed all of the records because the consumer was already running and waiting when they arrived.

  
But what if the records were produced before you started your consumer? In that case you wouldn't see them: by default, the console consumer only reads records that arrive after it has started up.

  
But what about reading previously sent records? In that case, you'll add one property, --from-beginning, to the start command for the console consumer.

  
Next, let's open up a console consumer again. This time you'll read everything your producer has sent to the topic you created in the previous step.

  
Run this command in the container shell you created for your first consumer and note the additional property --from-beginning:

  

kafka-console-consumer \
  --topic orders \
  --bootstrap-server broker:9092 \
  --from-beginning

  

 

After the consumer starts, you should see the following output within a few seconds:

  

the
lazy
fox
jumped over the brown cow
how now
brown cow
all streams lead
to Kafka!

  

 

One word of caution about the --from-beginning flag: as the name implies, this setting forces the consumer to retrieve every record currently on the topic. It's best used for testing and learning, not on a production topic.
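If you do want a bounded peek while testing, the console consumer also accepts a --max-messages flag that makes it exit on its own after consuming the given number of records. This variant isn't needed for the tutorial, just handy to know:

kafka-console-consumer \
  --topic orders \
  --bootstrap-server broker:9092 \
  --from-beginning \
  --max-messages 4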

  
Again, once you've received all the records, close this console consumer by entering CTRL+C.

  
Kafka works with key-value pairs, but so far you've only sent records with values. Well, to be fair, you have been sending key-value pairs all along; the keys were just null.

Sometimes you'll need to send a valid key in addition to the value from the command line.

  
To enable sending full key-value pairs from the command line, you add two properties to your console producer: parse.key and key.separator.

  
Let's try sending some full key-value records now. If your previous console producer is still running, close it with CTRL+C, then run the following command to start a new console producer:

  

kafka-console-producer \
  --topic orders \
  --bootstrap-server broker:9092 \
  --property parse.key=true \
  --property key.separator=":"

  

 

Then enter these records either one at a time or copy and paste all of them into the terminal and hit Enter:

  

key1:what a lovely
key1:bunch of coconuts
foo:bar
fun:programming

 

  
Now that we've produced full key-value pairs from the command line, you'll want to consume full key-value pairs from the command line as well.

  
If your console consumer from the previous step is still open, shut it down with CTRL+C. Then run the following command to re-open the console consumer, this time printing the full key-value pairs. Note the added properties print.key and key.separator. Also note that a different key separator is used here; you don't have to use the same one between console producers and consumers.

  

kafka-console-consumer \
  --topic orders \
  --bootstrap-server broker:9092 \
  --from-beginning \
  --property print.key=true \
  --property key.separator="-"

  

 

After the consumer starts, you should see the following output within a few seconds:

  

null-the
null-lazy
null-fox
null-jumped over the brown cow
null-how now
null-brown cow
null-all streams lead
null-to Kafka!
key1-what a lovely
key1-bunch of coconuts
foo-bar
fun-programming

  

 

Since we kept the --from-beginning property, you'll see all the records sent to the topic. You'll notice that the records produced before you started sending keys are formatted as null-<value>.

  
Go back to your open windows and stop any console producers and consumers with CTRL+C, then close the container shells with CTRL+D.
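If you're done with the local cluster entirely, you can also tear down the containers. Assuming you're still in the directory containing the docker-compose.yml file from earlier, the standard teardown command stops and removes both containers:

docker compose down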

  
Instead of running a local Kafka cluster, you may use Confluent Cloud, a fully-managed Apache Kafka service.

  
Sign up for Confluent Cloud, a fully-managed Apache Kafka service.

  
After you log in to the Confluent Cloud Console, click on Add cloud environment and name the environment learn-kafka. Using a new environment keeps your learning resources separate from your other Confluent Cloud resources.

  
From the Billing & payment section in the menu, apply the promo code CC100KTS to receive an additional $100 of free usage on Confluent Cloud (details).

  
Click on LEARN and follow the instructions to launch a Kafka cluster and to enable Schema Registry.

  
Next, from the Confluent Cloud Console, click on Clients to get the cluster-specific configurations, e.g. Kafka cluster bootstrap servers and credentials, Confluent Cloud Schema Registry and credentials, etc., and set the appropriate parameters in your client application.
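As a rough sketch, the client configuration you assemble from the Clients page is a properties file along these lines; the bootstrap server, API key, and API secret below are illustrative placeholders, not real values:

# Connection settings copied from the Confluent Cloud Clients page (placeholder values)
bootstrap.servers=pkc-xxxxx.us-west-2.aws.confluent.cloud:9092
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username='<API_KEY>' password='<API_SECRET>';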

  
Now you're all set to run your streaming application locally, backed by a Kafka cluster fully managed by Confluent Cloud.
