Following setup:
- Confluent Enterprise 6.1.1 dev environment on Ubuntu 21.04
- OpenJDK 1.8.0_292

Installed Connectors:
- JDBC Connector 10.1.1
- Confluent SMT Transformations 1.3.2
- Confluent Replicator 6.1.1
- MariaDB JDBC Client 2.7.3
- Schema Registry Transfer SMT 0.2.0
Steps to reproduce:
1. Start Confluent.
2. Import a table from a MariaDB database in Avro format (via the JDBC source connector).
3. Use Confluent Replicator with the Schema Registry Transfer SMT to replicate the input test topic to a different output topic on the same cluster; an example configuration is sketched below.
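A Replicator configuration of the kind used in step 3 looks roughly like this. This is a sketch only, not the exact configuration from this environment: hosts, ports, topic name, and the rename format are placeholders, the connector name is taken from the task id in the log below, other required Replicator settings (e.g. licensing) are omitted, and the SMT property names should be checked against the README of the installed SMT version.

```bash
# Sketch of step 3: Replicator with the SchemaRegistryTransfer SMT, replicating the
# "test" topic to "test.replica" on the same cluster. All hosts/ports and topic names
# are placeholders; verify the src/dest schema registry keys against the SMT README.
curl -s -X PUT -H "Content-Type: application/json" \
  http://localhost:8083/connectors/test.replicate/config -d '{
    "connector.class": "io.confluent.connect.replicator.ReplicatorSourceConnector",
    "src.kafka.bootstrap.servers": "localhost:9092",
    "dest.kafka.bootstrap.servers": "localhost:9092",
    "topic.whitelist": "test",
    "topic.rename.format": "${topic}.replica",
    "key.converter": "io.confluent.connect.replicator.util.ByteArrayConverter",
    "value.converter": "io.confluent.connect.replicator.util.ByteArrayConverter",
    "transforms": "AvroSchemaTransfer",
    "transforms.AvroSchemaTransfer.type": "cricket.jmoore.kafka.connect.transforms.SchemaRegistryTransfer",
    "transforms.AvroSchemaTransfer.src.schema.registry.url": "http://localhost:8081",
    "transforms.AvroSchemaTransfer.dest.schema.registry.url": "http://localhost:8081"
  }'
```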
The following exception is thrown afterwards:
[2021-06-09 09:15:42,905] ERROR WorkerSourceTask{id=test.replicate-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:187)
java.lang.NoClassDefFoundError: org/codehaus/jackson/JsonParseException
    at cricket.jmoore.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getSchemaByIdFromRegistry(CachedSchemaRegistryClient.java:124)
    at cricket.jmoore.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getBySubjectAndId(CachedSchemaRegistryClient.java:190)
    at cricket.jmoore.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getById(CachedSchemaRegistryClient.java:169)
    at cricket.jmoore.kafka.connect.transforms.SchemaRegistryTransfer.copySchema(SchemaRegistryTransfer.java:216)
    at cricket.jmoore.kafka.connect.transforms.SchemaRegistryTransfer.apply(SchemaRegistryTransfer.java:179)
    at org.apache.kafka.connect.runtime.TransformationChain.lambda$apply$0(TransformationChain.java:50)
    at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:146)
    at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:180)
    at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:122)
    at org.apache.kafka.connect.runtime.TransformationChain.apply(TransformationChain.java:50)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:339)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:264)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:185)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:235)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: org.codehaus.jackson.JsonParseException
    at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
    at org.apache.kafka.connect.runtime.isolation.PluginClassLoader.loadClass(PluginClassLoader.java:104)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
    ... 19 more
The code is untested on CP 6.1.1, but I assume that release uses the newer Jackson 2.x classes rather than the Jackson 1.x (Codehaus) classes this project depends on, so those classes indeed wouldn't exist on the classpath.
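One quick way to confirm that on the worker host is to look for Jackson 1.x vs. Jackson 2.x jars under the platform's library directory (CONFLUENT_HOME and the share/java layout are assumptions about a standard CP install, not values taken from this report):

```bash
# Jackson 1.x (org.codehaus.jackson) ships as jackson-*-asl jars, while CP 6.1.x is
# expected to bundle only Jackson 2.x (com.fasterxml.jackson).
# CONFLUENT_HOME is an assumed install location.
find "$CONFLUENT_HOME/share/java" -name 'jackson-*asl*.jar'      # Jackson 1.x: expected empty
find "$CONFLUENT_HOME/share/java" -name 'jackson-core-2.*.jar'   # Jackson 2.x: expected present
```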
You could try to check out and build the fork from #29 to get the upgraded JSON classes.
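Roughly like this (the fork URL and branch from #29 aren't spelled out here, so they stay as placeholders; this also assumes the project's standard Maven build and that the copy target is on the worker's plugin.path):

```bash
# Build the SMT from the fork referenced in #29 and put the resulting jar where the
# Connect worker's plugin.path can see it. <fork-url>, <branch>, and the plugin
# directory are placeholders.
git clone <fork-url> schema-registry-transfer-smt
cd schema-registry-transfer-smt
git checkout <branch>
mvn clean package -DskipTests        # assumes the standard Maven build
cp target/schema-registry-transfer-smt-*.jar /path/to/connect/plugins/
# ...then restart the Connect worker so the rebuilt jar is loaded.
```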