Problems encountered
In a real project I needed to push data through Kafka, so to make that (and similar future needs) more convenient, I naturally wrote a utility class during development. When testing against a Kafka server without authentication, sending worked fine. In reality, however, the other party's server required authentication, so the client could not connect and the push failed, and I had to ask the other party to confirm their authentication configuration. Even after researching a fix, verification still behaved strangely in places, so this article briefly records the problems I ran into and the verified results.
Pitfalls I hit
1. Search results (mostly from Baidu) always claim that introducing the JAAS configuration file alone is enough and that no other configuration is needed, but in my actual tests that did not work. You still have to introduce the configuration file and set the configuration parameters.
2. A follow-up question, thinking ahead: can the introduced configuration file contain more than one login block? According to the search results it can; the blocks are matched one by one, in order, until a match succeeds. That is, the KafkaClient configuration block shown below can appear multiple times, as long as the block names differ. Due to the limitations of my environment, though, I have not tried it. 😀
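For what it is worth, a multi-block JAAS file along the lines those search results describe would look something like the sketch below. This is untested (as noted above), and the login module class, block names, and credentials are all placeholders:

```
KafkaClient {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="userA"
    password="123";
};

AnotherKafkaClient {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="userB"
    password="456";
};
```

Each block follows the standard JAAS syntax: a context name, a login module class with a control flag, the module options, a semicolon, and a closing `};`.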
Processing steps
Loading configuration
Many articles mention that you need to introduce the authentication configuration file and set a system property when the service starts. There are two ways to do so.
a). Introduce it via a JVM parameter when the service starts, with a command like this (fill in the actual path yourself; my configuration file sits in the same directory as the jar, so there is no path prefix):

java -Djava.security.auth.login.config=kafka_jaas_config.config -jar <your-app.jar>
b). Introduce it in code, for example (I did not try this method):

System.setProperty("java.security.auth.login.config", "kafka_jaas_config.config");
c). The content of the configuration file (note that the damn semicolon after password is required, and it is also required later when setting the configuration parameter). The login module class below assumes the PLAIN mechanism; substitute the module that matches your broker's mechanism:

KafkaClient {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="user"
    password="123";
};
Set authentication parameters (if any)
The parameters to fill in are security.protocol and sasl.mechanism (it is said that in higher client versions these two have workable defaults and authentication succeeds without setting them explicitly, but lower versions require them, so I recommend setting both), plus sasl.jaas.config itself.
for example:

props.put("sasl.jaas.config", "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"user\" password=\"123\";");
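Putting the three parameters together, here is a minimal sketch of just the authentication-related producer settings. It uses plain `java.util.Properties` with string keys so it compiles without the Kafka client on the classpath; the broker address, SASL_PLAINTEXT protocol, PLAIN mechanism, and credentials are placeholder assumptions to adjust for your broker:

```java
import java.util.Properties;

public class SaslPropsSketch {
    // Builds only the authentication-related producer settings.
    static Properties saslProps(String servers, String user, String pwd) {
        Properties props = new Properties();
        props.put("bootstrap.servers", servers);
        // Assumption: SASL/PLAIN without TLS; change to SASL_SSL etc. as needed.
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.mechanism", "PLAIN");
        // The trailing semicolon inside the value is mandatory.
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                        + "username=\"" + user + "\" password=\"" + pwd + "\";");
        return props;
    }

    public static void main(String[] args) {
        Properties p = saslProps("ip:port", "user", "123");
        System.out.println(p.getProperty("sasl.jaas.config"));
    }
}
```

These entries can then be passed in as the `otherConfigs` map of the utility class below.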
Tools
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import javax.annotation.PreDestroy;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
public class KafkaUtil {

    // One cached template per bootstrap-server string.
    private static final ConcurrentHashMap<String, KafkaTemplate<String, String>> templateCache = new ConcurrentHashMap<>();

    private Map<String, Object> kafkaProducerConfigs(String servers, Map<String, Object> otherConfigs) {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, servers);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        if (otherConfigs != null) {
            otherConfigs.forEach(props::put);
        }
        return props;
    }

    public KafkaTemplate<String, String> getKafkaTemplate(String servers, Map<String, Object> otherConfigs) {
        return templateCache.computeIfAbsent(servers, bs -> createKafkaTemplate(bs, otherConfigs));
    }

    private KafkaTemplate<String, String> createKafkaTemplate(String servers, Map<String, Object> otherConfigs) {
        Map<String, Object> configs = kafkaProducerConfigs(servers, otherConfigs);
        ProducerFactory<String, String> producerFactory = new DefaultKafkaProducerFactory<>(configs);
        return new KafkaTemplate<>(producerFactory);
    }

    @PreDestroy
    public void destroy() {
        for (KafkaTemplate<String, String> template : templateCache.values()) {
            // Closes the underlying producer when the application shuts down.
            template.destroy();
        }
    }
}
How to use it
1. Autowire the KafkaUtil bean as usual.
2. Prepare your own configs, including the authentication parameters if the target server requires them.
3. Get the template and send the data.
4. Process the result.
Map<String, Object> configs = new HashMap<>(); // Fill in the configs yourself
KafkaTemplate<String, String> kafkaTemplate = kafkaUtil.getKafkaTemplate("ip:port", configs);
kafkaTemplate.send("topic name", "Message Content");
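For step 4, note that `KafkaTemplate.send()` returns a future (a `ListenableFuture<SendResult<K,V>>` in Spring Kafka 2.x, a `CompletableFuture` in 3.x), so the result must be handled asynchronously. The sketch below shows the pattern with a plain `CompletableFuture` standing in for the send result, since the exact callback type depends on your Spring Kafka version; the result string is a made-up placeholder:

```java
import java.util.concurrent.CompletableFuture;

public class SendResultSketch {
    public static void main(String[] args) {
        // Stand-in for kafkaTemplate.send(...): a future that completes with
        // a result on success, or exceptionally on failure.
        CompletableFuture<String> sendFuture =
                CompletableFuture.supplyAsync(() -> "partition=0, offset=42");

        sendFuture.whenComplete((result, ex) -> {
            if (ex != null) {
                // Push failed: log, alert, or retry here.
                System.out.println("send failed: " + ex.getMessage());
            } else {
                // Push succeeded: record the metadata if needed.
                System.out.println("send ok: " + result);
            }
        });

        sendFuture.join(); // Only for the demo; avoid blocking in production code.
    }
}
```

With the real template the call would be the same shape: register the callback on the returned future instead of blocking on it.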
Closing remarks
I added the template cache to make it convenient to push to multiple Kafka services; if you do not need that, you can easily modify the class to remove this part. If you have any questions or suggestions for improvement, feel free to raise them; corrections are welcome.
That concludes this article on using Spring Boot to push data to a Kafka server that requires authentication. I hope it helps you.