
[[sleuth-integration]]
= Spring Cloud Sleuth customization

include::_attributes.adoc[]

In this section, we describe how to customize various parts of Spring Cloud Sleuth. Please check the <> for the list of spans, tags and events.

[[sleuth-kafka-integration]]
== Apache Kafka

This feature is available for all tracer implementations.

We decorate the Kafka clients (`KafkaProducer` and `KafkaConsumer`) to create a span for each event that is produced or consumed. You can disable this feature by setting the value of `spring.sleuth.kafka.enabled` to `false`.

IMPORTANT: You have to register the `Producer` or `Consumer` as beans in order for Sleuth's auto-configuration to decorate them. When you then inject the beans, the expected type must be `Producer` or `Consumer` (and NOT e.g. `KafkaProducer`).
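
For example, a producer could be declared with `Producer` as the bean type so that Sleuth's auto-configuration can decorate it. The following is a minimal sketch (the broker address, serializers, and bean name are illustrative assumptions):

[source,java,indent=0]
----
@Bean(destroyMethod = "close")
Producer<String, String> myKafkaProducer() {
    Map<String, Object> props = new HashMap<>();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    // the declared type is Producer (not KafkaProducer) so that the decorated bean can be injected
    return new KafkaProducer<>(props);
}
----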

We also provide `TracingKafkaProducerFactory` and `TracingKafkaConsumerFactory` to be used with the https://projectreactor.io/docs/kafka/release/reference/[Reactor Kafka] clients (`KafkaSender` and `KafkaReceiver`, respectively). See an example in the snippet below:

[source,java,indent=0]
----
@Bean
KafkaReceiver<String, String> reactiveKafkaReceiver(TracingKafkaConsumerFactory tracingKafkaConsumerFactory, ReceiverOptions<String, String> kafkaReceiverOptions) {
    return KafkaReceiver.create(tracingKafkaConsumerFactory, kafkaReceiverOptions);
}
----

Additionally, we decorate any https://docs.spring.io/spring-kafka/docs/current/reference/html/[Spring Kafka] `ProducerFactory` and `ConsumerFactory` available in the context. However, this is disabled if Brave instrumentation is on the classpath.

[[sleuth-async-integration]]
== Asynchronous Communication

In this section, we describe how to customize asynchronous communication with Spring Cloud Sleuth.

[[sleuth-async-annotation-integration]]
=== `@Async` Annotated methods

This feature is available for all tracer implementations.

In Spring Cloud Sleuth, we instrument async-related components so that the tracing information is passed between threads.
You can disable this behavior by setting the value of `spring.sleuth.async.enabled` to `false`.

If you annotate your method with `@Async`, we automatically modify the existing Span as follows:

* If the method is annotated with `@SpanName`, the value of the annotation is the Span's name.
* If the method is not annotated with `@SpanName`, the Span name is the annotated method name.
* The span is tagged with the method's class name and method name.

Since we're modifying the existing span, if you want to maintain its original name (e.g. a span created by receiving an HTTP request)
you should wrap your `@Async` annotated method with a `@NewSpan` annotation or create a new span manually.
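
For example, a minimal sketch (class and method names are illustrative) of an `@Async` method that uses `@SpanName` to control the resulting span name:

[source,java,indent=0]
----
@Async
@SpanName("calculate-tax")
public void calculateTax(String orderId) {
    // runs on the @Async executor; the existing span is renamed to "calculate-tax"
    // and tagged with this class name and method name
}
----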

[[sleuth-async-scheduled-integration]]
=== `@Scheduled` Annotated Methods

This feature is available for all tracer implementations.

In Spring Cloud Sleuth, we instrument scheduled method execution so that the tracing information is passed between threads.
You can disable this behavior by setting the value of `spring.sleuth.scheduled.enabled` to `false`.

If you annotate your method with `@Scheduled`, we automatically create a new span with the following characteristics:

* The span name is the annotated method name.
* The span is tagged with the method's class name and method name.

If you want to skip span creation for some `@Scheduled` annotated classes, you can set the `spring.sleuth.scheduled.skipPattern` with a regular expression that matches the fully qualified name of the `@Scheduled` annotated class.
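
For example, a minimal sketch (names are illustrative) of a scheduled method for which a new span is created on every execution:

[source,java,indent=0]
----
@Component
class CleanupJob {

    @Scheduled(fixedDelay = 30_000)
    void removeStaleEntries() {
        // a new span named after this method is created for every execution
        // and tagged with the class name and method name
    }
}
----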

[[sleuth-async-executor-service-integration]]
=== Executor, ExecutorService, and ScheduledExecutorService

This feature is available for all tracer implementations.

We provide `LazyTraceExecutor`, `TraceableExecutorService`, and `TraceableScheduledExecutorService`.
Those implementations create spans each time a new task is submitted, invoked, or scheduled.

The following example shows how to pass tracing information with `TraceableExecutorService` when working with `CompletableFuture`:

[source,java,indent=0]
----

include::{common_tests_path}/src/main/java/org/springframework/cloud/sleuth/instrument/async/TraceableExecutorServiceTests.java[tags=completablefuture,indent=0]
----

IMPORTANT: Sleuth does not work with `parallelStream()` out of the box.
If you want to have the tracing information propagated through the stream, you have to use the approach with `supplyAsync(...)`, as shown earlier.
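
A minimal sketch of that `supplyAsync(...)` approach, assuming an injected `BeanFactory` and an illustrative delegate executor:

[source,java,indent=0]
----
ExecutorService delegate = Executors.newFixedThreadPool(2);
CompletableFuture<String> result = CompletableFuture.supplyAsync(
        // the supplier runs with the tracing context propagated
        () -> "processing done",
        new TraceableExecutorService(beanFactory, delegate));
----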

If there are beans that implement the `Executor` interface that you would like to exclude from span creation, you can use the `spring.sleuth.async.ignored-beans`
property where you can provide a list of bean names.

You can disable this behavior by setting the value of `spring.sleuth.async.enabled` to `false`.

[[sleuth-async-executor-integration]]
==== Customization of Executors

Sometimes, you need to set up a custom instance of the `AsyncExecutor`.
The following example shows how to set up such a custom `Executor`:

[source,java,indent=0]
----
include::{common_tests_path}/src/main/java/org/springframework/cloud/sleuth/instrument/web/client/MultipleAsyncRestTemplateTests.java[tags=custom_executor,indent=0]
----

TIP: To ensure that your configuration gets post processed, remember to add the `@Role(BeanDefinition.ROLE_INFRASTRUCTURE)` on your
`@Configuration` class

[[sleuth-http-client-integration]]
== HTTP Client Integration

Features from this section can be disabled by setting the `spring.sleuth.web.client.enabled` property to `false`.

[[sleuth-http-client-rest-template-integration]]
=== Synchronous Rest Template

This feature is available for all tracer implementations.

We inject a `RestTemplate` interceptor to ensure that all the tracing information is passed to the requests.
Each time a call is made, a new Span is created.
It gets closed upon receiving the response.
To block the synchronous `RestTemplate` features, set `spring.sleuth.web.client.enabled` to `false`.

IMPORTANT: You have to register `RestTemplate` as a bean so that the interceptors get injected.
If you create a `RestTemplate` instance with a `new` keyword, the instrumentation does NOT work.
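
For example, a minimal sketch of such a bean definition:

[source,java,indent=0]
----
@Bean
RestTemplate restTemplate() {
    // registered as a bean so that Sleuth can add its tracing interceptor
    return new RestTemplate();
}
----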

[[sleuth-http-client-async-rest-template-integration]]
=== Asynchronous Rest Template

This feature is available for all tracer implementations.

IMPORTANT: Starting with Sleuth `2.0.0`, we no longer register a bean of `AsyncRestTemplate` type.
It is up to you to create such a bean.
Then we instrument it.
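
A minimal sketch of such a bean definition (Sleuth then instruments the returned instance):

[source,java,indent=0]
----
@Bean
AsyncRestTemplate asyncRestTemplate() {
    return new AsyncRestTemplate();
}
----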

To block the `AsyncRestTemplate` features, set `spring.sleuth.web.async.client.enabled` to `false`.
To disable creation of the default `TraceAsyncClientHttpRequestFactoryWrapper`, set `spring.sleuth.web.async.client.factory.enabled`
to `false`.
If you do not want to create `AsyncRestTemplate` at all, set `spring.sleuth.web.async.client.template.enabled` to `false`.

[[sleuth-http-client-multiple-async-rest-template-integration]]
==== Multiple Asynchronous Rest Templates

Sometimes you need to use multiple implementations of the Asynchronous Rest Template.
In the following snippet, you can see an example of how to set up such a custom `AsyncRestTemplate`:

[source,java,indent=0]
----
include::{common_tests_path}/src/main/java/org/springframework/cloud/sleuth/instrument/web/client/MultipleAsyncRestTemplateTests.java[tags=custom_async_rest_template,indent=0]
----

[[sleuth-http-client-webclient-integration]]
==== `WebClient`

This feature is available for all tracer implementations.

We inject an `ExchangeFilterFunction` implementation that creates a span and, through on-success and on-error callbacks, takes care of closing client-side spans.

To block this feature, set `spring.sleuth.web.client.enabled` to `false`.

IMPORTANT: You have to register `WebClient` as a bean so that the tracing instrumentation gets applied.
If you create a `WebClient` instance with a `new` keyword, the instrumentation does NOT work.
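
For example, a minimal sketch that builds the `WebClient` from the auto-configured builder and exposes it as a bean (the base URL is an illustrative assumption):

[source,java,indent=0]
----
@Bean
WebClient tracedWebClient(WebClient.Builder builder) {
    // the tracing ExchangeFilterFunction is applied because the WebClient is a bean
    return builder.baseUrl("http://localhost:8080").build();
}
----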

[[sleuht-http-client-logbook-with-webclient]]
===== Logbook with WebClient

In order to add support for Logbook with WebClient (`org.zalando:logbook-spring-boot-webflux-autoconfigure`), you need to add the following configuration. You can read more about this integration in https://github.com/spring-cloud/spring-cloud-sleuth/issues/1690[this issue].

[source,java,indent=0]
----
@Configuration
@Import(LogbookWebFluxAutoConfiguration.class)
public class LogbookConfiguration {

	@Bean
	public LogstashLogbackSink logbackSink(final HttpLogFormatter formatter) {
		return new LogstashLogbackSink(formatter);
	}

	@Bean
	public CorrelationId correlationId(final Tracer tracer) {
		return request -> requireNonNull(requireNonNull(tracer.currentSpan())).context().traceId();
	}

	@Bean
	ReactorNettyHttpTracing reactorNettyHttpTracing(final HttpTracing httpTracing) {
		return ReactorNettyHttpTracing.create(httpTracing);
	}

	@Bean
	NettyServerCustomizer nettyServerCustomizer(final Logbook logbook,
			final ReactorNettyHttpTracing reactorNettyHttpTracing) {
		return server -> reactorNettyHttpTracing.decorateHttpServer(
				server.doOnConnection(conn -> conn.addHandlerFirst(new LogbookServerHandler(logbook))));
	}

	@Bean
	WebClient webClient(final Logbook logbook, final ReactorNettyHttpTracing reactorNettyHttpTracing) {
		return WebClient.builder()
				.clientConnector(new ReactorClientHttpConnector(reactorNettyHttpTracing.decorateHttpClient(HttpClient
						.create().doOnConnected(conn -> conn.addHandlerLast(new LogbookClientHandler(logbook))))))
				.build();
	}

}
----

[[sleuth-http-client-traverson-integration]]
==== Traverson

This feature is available for all tracer implementations.

If you use the https://docs.spring.io/spring-hateoas/docs/current/reference/html/#client.traverson[Traverson] library, you can inject a `RestTemplate` as a bean into your Traverson object.
Since `RestTemplate` is already intercepted, you get full support for tracing in your client.
The following pseudo code shows how to do that:

[source,java,indent=0]
----
@Autowired RestTemplate restTemplate;

Traverson traverson = new Traverson(URI.create("https://some/address"),
    MediaType.APPLICATION_JSON, MediaType.APPLICATION_JSON_UTF8).setRestOperations(restTemplate);
// use Traverson
----

[[sleuth-http-client-apache-integration]]
==== Apache `HttpClientBuilder` and `HttpAsyncClientBuilder`

This feature is available for the Brave tracer implementation.

We instrument the `HttpClientBuilder` and `HttpAsyncClientBuilder` so that tracing context gets injected to the sent requests.

To block these features, set `spring.sleuth.web.client.enabled` to `false`.

[[sleuth-http-client-netty-integration]]
==== Netty `HttpClient`

This feature is available for all tracer implementations.

We instrument Netty's `HttpClient`.

To block this feature, set `spring.sleuth.web.client.enabled` to `false`.

IMPORTANT: You have to register `HttpClient` as a bean so that the instrumentation happens.
If you create an `HttpClient` instance with a `new` keyword, the instrumentation does NOT work.
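
For example, a minimal sketch of such a bean definition for Reactor Netty's `HttpClient`:

[source,java,indent=0]
----
@Bean
HttpClient nettyHttpClient() {
    // exposed as a bean so that Sleuth can instrument it
    return HttpClient.create();
}
----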

[[sleuth-http-client-userinfo-integration]]
==== `UserInfoRestTemplateCustomizer`

This feature is available for all tracer implementations.

We instrument Spring Security's `UserInfoRestTemplateCustomizer`.

To block this feature, set `spring.sleuth.web.client.enabled` to `false`.

[[sleuth-http-server-integration]]
== HTTP Server Integration

Features from this section can be disabled by setting the `spring.sleuth.web.enabled` property to `false`.

[[sleuth-http-server-http-filter-integration]]
=== HTTP Filter

This feature is available for all tracer implementations.

Through the `TracingFilter`, all sampled incoming requests result in creation of a Span.
You can configure which URIs you would like to skip by setting the `spring.sleuth.web.skipPattern` property.
If you have `ManagementServerProperties` on the classpath, its value of `contextPath` gets appended to the provided skip pattern.
If you want to reuse Sleuth's default skip patterns and just append your own, pass those patterns by using the `spring.sleuth.web.additionalSkipPattern` property.

By default, all the Spring Boot Actuator endpoints are automatically added to the skip pattern.
If you want to disable this behaviour, set `spring.sleuth.web.ignore-auto-configured-skip-patterns`
to `true`.

To change the order of tracing filter registration, please set the
`spring.sleuth.web.filter-order` property.

To disable the filter that logs uncaught exceptions you can disable the
`spring.sleuth.web.exception-throwing-filter-enabled` property.

[[sleuth-http-server-handler-interceptor-integration]]
=== HandlerInterceptor

This feature is available for all tracer implementations.

Since we want the span names to be precise, we use a `TraceHandlerInterceptor` that either wraps an existing `HandlerInterceptor` or is added directly to the list of existing `HandlerInterceptors`.
The `TraceHandlerInterceptor` adds a special request attribute to the given `HttpServletRequest`.
If the `TracingFilter` does not see this attribute, it creates a "`fallback`" span, which is an additional span created on the server side so that the trace is presented properly in the UI.
If that happens, there is probably missing instrumentation.
In that case, please file an issue in Spring Cloud Sleuth.

[[sleuth-http-server-async-integration]]
=== Async Servlet support

This feature is available for all tracer implementations.

If your controller returns a `Callable` or a `WebAsyncTask`, Spring Cloud Sleuth continues the existing span instead of creating a new one.
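
For example, a minimal sketch (path and names are illustrative) of a controller method that returns a `Callable` and therefore continues the existing span:

[source,java,indent=0]
----
@RestController
class ReportController {

    @GetMapping("/report")
    Callable<String> report() {
        // executed on a separate thread; Sleuth continues the span of the incoming request
        return () -> "report generated";
    }
}
----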

[[sleuth-http-server-webflux-integration]]
=== WebFlux support

This feature is available for all tracer implementations.

Through `TraceWebFilter`, all sampled incoming requests result in creation of a Span.
That Span's name is `http:` + the path to which the request was sent.
For example, if the request was sent to `/this/that`, the name is `http:/this/that`.
You can configure which URIs you would like to skip by using the `spring.sleuth.web.skipPattern` property.
If you have `ManagementServerProperties` on the classpath, its value of `contextPath` gets appended to the provided skip pattern.
If you want to reuse Sleuth's default skip patterns and append your own, pass those patterns by using the `spring.sleuth.web.additionalSkipPattern`.

In order to achieve the best results in terms of performance and context propagation, we suggest that you switch the `spring.sleuth.reactor.instrumentation-type` property to `MANUAL`.
In order to execute code with the span in scope you can call `WebFluxSleuthOperators.withSpanInScope`.
Example:

[source,java,indent=0]
-----
include::{project-root}/benchmarks/src/main/java/org/springframework/cloud/sleuth/benchmarks/app/webflux/SleuthBenchmarkingSpringWebFluxApp.java[tags=simple_manual,indent=0]
-----

To change the order of tracing filter registration, please set the
`spring.sleuth.web.filter-order` property.

[[sleuth-reactor-netty-http-server-integration]]
=== Reactor Netty HttpServer

If you're using Reactor Netty and would like to have your access logs instrumented, you need to add the `io.projectreactor.netty:reactor-netty-http-brave` dependency (this works only with the Brave tracer) and add the following configuration to your project.

[source,java,indent=0]
-----
import brave.http.HttpTracing;
import reactor.netty.http.brave.ReactorNettyHttpTracing;

@Configuration(proxyBeanMethods = false)
class TraceNettyConfig {

	@Bean
	NettyServerCustomizer traceNettyServerCustomizer(ObjectProvider<HttpTracing> tracing) {
		return server -> ReactorNettyHttpTracing.create(tracing.getObject()).decorateHttpServer(server);
	}
}
-----

[[sleuth-messaging-integration]]
== Messaging

Features from this section can be disabled by setting the `spring.sleuth.messaging.enabled` property to `false`.

[[sleuth-messaging-spring-integration-integration]]
=== Spring Integration

This feature is available for all tracer implementations.

Spring Cloud Sleuth integrates with https://projects.spring.io/spring-integration/[Spring Integration].
It creates spans for publish and subscribe events.
To disable Spring Integration instrumentation, set `spring.sleuth.integration.enabled` to `false`.

You can set the `spring.sleuth.integration.patterns` property to explicitly provide the names of channels that you want to include for tracing.
By default, all channels except the `hystrixStreamOutput` channel are included.

IMPORTANT: When using the `Executor` to build a Spring Integration `IntegrationFlow`, you must use the untraced version of the `Executor`.
Decorating the Spring Integration Executor Channel with `TraceableExecutorService` causes the spans to be improperly closed.

If you want to customize the way the tracing context is read from and written to message headers, register beans of the following types (see the sketch after this list):

* `Propagator.Setter` - for writing headers to the message
* `Propagator.Getter` - for reading headers from the message
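
A minimal sketch of such beans, assuming `MessageHeaderAccessor` as the carrier type for message headers (the bean names are illustrative):

[source,java,indent=0]
----
@Bean
Propagator.Setter<MessageHeaderAccessor> customPropagatorSetter() {
    // writes a tracing header to the outgoing message
    return (carrier, key, value) -> carrier.setHeader(key, value);
}

@Bean
Propagator.Getter<MessageHeaderAccessor> customPropagatorGetter() {
    // reads a tracing header from the incoming message
    return (carrier, key) -> Objects.toString(carrier.getHeader(key), null);
}
----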

[[sleuth-messaging-spring-integration-customization]]
==== Customizing messaging spans

In order to change the default span names and tags, just register a bean of type `MessageSpanCustomizer`. You can also
override the existing `DefaultMessageSpanCustomizer` to extend the existing behaviour.

[source,java]
----
@Component
include::{common_tests_path}/src/main/java/org/springframework/cloud/sleuth/instrument/messaging/TracingChannelInterceptorTest.java[tags=message_span_customizer,indent=2]
----

[[sleuth-messaging-spring-cloud-function-integration]]
=== Spring Cloud Function and Spring Cloud Stream

This feature is available for all tracer implementations.

Spring Cloud Sleuth can instrument Spring Cloud Function. Since Spring Cloud Stream uses Spring Cloud Function,
you get the messaging instrumentation out of the box.

The way to achieve it is to provide a `Function`, `Consumer`, or `Supplier` that takes in a `Message` as a parameter, e.g. `Function<Message<String>, Message<String>>`.
If the type **is not** `Message` then instrumentation **will not** take place.
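
For example, a minimal sketch (names are illustrative) of a function that gets instrumented because its signature uses `Message`:

[source,java,indent=0]
----
@Bean
Function<Message<String>, Message<String>> toUpperCase() {
    // instrumented because the input and output types are Message
    return message -> MessageBuilder.withPayload(message.getPayload().toUpperCase())
            .copyHeaders(message.getHeaders())
            .build();
}
----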

For a reactive `Consumer<Flux<Message<String>>>` remember to manually close the span and clear the context before you call `.subscribe()`. Example:

[source,java,indent=0]
----
@Bean
Consumer<Flux<Message<String>>> channel(Tracer tracer) {
	// For the reactive consumer remember to call "subscribe()" at the end, otherwise
	// you'll get the "Dispatcher has no subscribers" error
	return i -> i
			.doOnNext(s -> log.info("HELLO"))
			// You must finish the span yourself and clear the tracing context as presented below.
			// Otherwise you will be missing out on the span that wraps the function execution.
			.doOnNext(s -> {
				tracer.currentSpan().end();
				tracer.withSpan(null);
			})
			.subscribe();
}
----

You can disable Spring Cloud Stream integration by setting the value of `spring.sleuth.function.enabled` to `false`.

If you want to fully control the life cycle of spans within the reactive messaging context of Spring Cloud Stream,
remember to disable the Spring Cloud Stream integration and leverage the `MessagingSleuthOperators` utility
class, which allows you to manipulate the input and output messages in order to continue the tracing context and to execute custom code within the tracing context.


[source,java,indent=0]
-----
include::{project-root}/benchmarks/src/main/java/org/springframework/cloud/sleuth/benchmarks/app/stream/SleuthBenchmarkingStreamApplication.java[tags=simple_reactive,indent=0]
-----

[[sleuth-messaging-spring-rabbitmq-integration]]
=== Spring RabbitMQ

This feature is available for the Brave tracer implementation.

We instrument the `RabbitTemplate` so that tracing headers get injected into the message.

To block this feature, set `spring.sleuth.messaging.rabbit.enabled` to `false`.

[[sleuth-messaging-spring-kafka-integration]]
=== Spring Kafka

This feature is available for the Brave tracer implementation.

We instrument Spring Kafka's `ProducerFactory` and `ConsumerFactory`
so that tracing headers get injected into the created Spring Kafka
`Producer` and `Consumer`.

To block this feature, set `spring.sleuth.messaging.kafka.enabled` to `false`.

[[sleuth-messaging-spring-kafka-streams-integration]]
=== Spring Kafka Streams

This feature is available for the Brave tracer implementation.

We instrument the `KafkaStreams` `KafkaClientSupplier` so that tracing headers get injected into the `Producer` and `Consumer`s. A `KafkaStreamsTracing` bean allows for further instrumentation through additional `TransformerSupplier` and
`ProcessorSupplier` methods.

To block this feature, set `spring.sleuth.messaging.kafka.streams.enabled` to `false`.

[[sleuth-messaging-spring-jms-integration]]
=== Spring JMS

This feature is available for the Brave tracer implementation.

We instrument the `JmsTemplate` so that tracing headers get injected into the message.
We also support `@JmsListener` annotated methods on the consumer side.

To block this feature, set `spring.sleuth.messaging.jms.enabled` to `false`.

IMPORTANT: We don't support baggage propagation for JMS.

[[sleuth-openfeign-integration]]
== OpenFeign

This feature is available for all tracer implementations.

By default, Spring Cloud Sleuth provides integration with Feign through `TraceFeignClientAutoConfiguration`.
You can disable it entirely by setting `spring.sleuth.feign.enabled` to `false`.
If you do so, no Feign-related instrumentation takes place.

Part of Feign instrumentation is done through a `FeignBeanPostProcessor`.
You can disable it by setting `spring.sleuth.feign.processor.enabled` to `false`.
If you set it to `false`, Spring Cloud Sleuth does not instrument any of your custom Feign components.
However, all the default instrumentation is still there.

[[sleuth-opentracing-integration]]
== OpenTracing

This feature is available for all tracer implementations.

Spring Cloud Sleuth is compatible with https://opentracing.io/[OpenTracing].
If you have OpenTracing on the classpath, we automatically register the OpenTracing `Tracer` bean.
If you wish to disable this, set `spring.sleuth.opentracing.enabled` to `false`.

[[sleuth-quartz-integration]]
== Quartz

This feature is available for all tracer implementations.

We instrument Quartz jobs by adding Job/Trigger listeners to the Quartz Scheduler.

To turn off this feature, set the `spring.sleuth.quartz.enabled` property to `false`.

[[sleuth-reactor-integration]]
== Reactor

This feature is available for all tracer implementations.

We have the following modes of instrumenting Reactor-based applications, which can be set via the `spring.sleuth.reactor.instrumentation-type` property:

* `DECORATE_QUEUES` - With the new Reactor https://github.com/reactor/reactor-core/pull/2566[queue wrapping mechanism] (Reactor 3.4.3) we're instrumenting the way threads are switched by Reactor. This should lead to feature parity with `ON_EACH` with low performance impact.
* `DECORATE_ON_EACH` - wraps every Reactor operator in a trace representation.
Passes the tracing context in most cases.
This mode might lead to drastic performance degradation.
* `DECORATE_ON_LAST` - wraps the last Reactor operator in a trace representation.
Passes the tracing context in some cases, so accessing the MDC context might not work.
This mode might lead to medium performance degradation.
* `MANUAL` - wraps every Reactor operator in the least invasive way, without passing the tracing context.
It is up to the user to do so.

The current default is `ON_EACH` for backward-compatibility reasons; however, we encourage users to migrate to the `MANUAL` instrumentation and benefit from `WebFluxSleuthOperators` and `MessagingSleuthOperators`.
The performance improvement can be substantial.
Example:

[source,java,indent=0]
-----
include::{project-root}/benchmarks/src/main/java/org/springframework/cloud/sleuth/benchmarks/app/webflux/SleuthBenchmarkingSpringWebFluxApp.java[tags=simple_manual,indent=0]
-----

To disable Reactor support, set the `spring.sleuth.reactor.enabled` property to `false`.

[[sleuth-redis-integration]]
== Redis

This feature is available for all tracer implementations.

We're using the `Tracing` abstraction from Lettuce. If Brave is on the classpath we configure `Tracing` to be `BraveTracing`.

To disable Redis support, set the `spring.sleuth.redis.enabled` property to `false`.

[[sleuth-redis-legacy-integration]]
=== Redis With Legacy Brave Only Support

To use the Brave-only supported feature, you need to set the value of `spring.sleuth.redis.legacy.enabled` to `true`. This is the default mechanism, available up to version 3.1.0 of Spring Cloud Sleuth.

We set the `tracing` property on the Lettuce `ClientResources` instance to enable the Brave tracing built into Lettuce.

Spring Cloud Sleuth provides a traced version of the `ClientResources` bean. If you have your own implementation of that bean, remember to customize the `ClientResources.Builder` with the stream of `ClientResourcesBuilderCustomizer`s, as presented below:

[source,java,indent=0]
----
	@Bean(destroyMethod = "shutdown")
	DefaultClientResources myLettuceClientResources(ObjectProvider<ClientResourcesBuilderCustomizer> customizer) {
		DefaultClientResources.Builder builder = DefaultClientResources.builder();
		// setting up the builder manually
		customizer.stream().forEach(c -> c.customize(builder));
		return builder.build();
	}
----

[[sleuth-runnablecallable-integration]]
== Runnable and Callable

This feature is available for all tracer implementations.

If you wrap your logic in `Runnable` or `Callable`, you can wrap those classes in their Sleuth representative, as shown in the following example for `Runnable`:

[source,java,indent=0]
----
include::{brave_path}/src/test/java/org/springframework/cloud/sleuth/brave/SpringCloudSleuthDocTests.java[tags=trace_runnable,indent=0]
----

The following example shows how to do so for `Callable`:

[source,java,indent=0]
----
include::{brave_path}/src/test/java/org/springframework/cloud/sleuth/brave/SpringCloudSleuthDocTests.java[tags=trace_callable,indent=0]
----

That way, you ensure that a new span is created and closed for each execution.

[[sleuth-rpc-integration]]
== RPC

This feature is available for the Brave tracer implementation.

Sleuth automatically configures the `RpcTracing` bean which serves as a foundation for RPC instrumentation such as gRPC or Dubbo.

If a customization of client or server sampling of the RPC traces is required, just register a bean of type `brave.sampler.SamplerFunction` and name the bean `sleuthRpcClientSampler` for the client sampler and
`sleuthRpcServerSampler` for the server sampler.

For your convenience the `@RpcClientSampler` and `@RpcServerSampler`
annotations can be used to inject the proper beans or to reference the bean names via their static String `NAME` fields.

For example, here is a sampler that traces 100 "GetUserToken" server requests per second.
This sampler does not start new traces for requests to the health check service.
Other requests use the global sampling configuration.

[source,java,indent=0]
----
@Configuration(proxyBeanMethods = false)
class Config {
include::{autoconfig_path}/src/test/java/org/springframework/cloud/sleuth/autoconfig/brave/instrument/rpc/BraveRpcAutoConfigurationIntegrationTests.java[tags=custom_rpc_server_sampler,indent=2]
}
----

For more, see https://github.com/openzipkin/brave/tree/master/instrumentation/rpc#sampling-policy

[[sleuth-rpc-dubbo-integration]]
=== Dubbo RPC support

Via the integration with Brave, Spring Cloud Sleuth supports https://dubbo.apache.org/[Dubbo].
It's enough to add the `brave-instrumentation-dubbo` dependency:

[source,xml,indent=0]
----
<dependency>
    <groupId>io.zipkin.brave</groupId>
    <artifactId>brave-instrumentation-dubbo</artifactId>
</dependency>
----

You also need to set up a `dubbo.properties` file with the following contents:

```properties
dubbo.provider.filter=tracing
dubbo.consumer.filter=tracing
```

You can read more about Brave - Dubbo integration https://github.com/openzipkin/brave/tree/master/instrumentation/dubbo-rpc[here].
An example of Spring Cloud Sleuth and Dubbo can be found https://github.com/openzipkin/sleuth-webmvc-example/compare/add-dubbo-tracing[here].

[[sleuth-rpc-grpc-integration]]
=== gRPC

Spring Cloud Sleuth provides instrumentation for https://grpc.io/[gRPC] via the Brave tracer.
You can disable it entirely by setting `spring.sleuth.grpc.enabled` to `false`.

[[sleuth-rpc-grpc-variant1-integration]]
==== Variant 1

[[sleuth-rpc-grpc-variant1-dependencies-integration]]
===== Dependencies

IMPORTANT: The gRPC integration relies on two external libraries to instrument clients and servers and both of those libraries must be on the class path to enable the instrumentation.

Maven:

```
<dependency>
    <groupId>io.github.lognet</groupId>
    <artifactId>grpc-spring-boot-starter</artifactId>
</dependency>
<dependency>
    <groupId>io.zipkin.brave</groupId>
    <artifactId>brave-instrumentation-grpc</artifactId>
</dependency>
```

Gradle:

```
    compile("io.github.lognet:grpc-spring-boot-starter")
    compile("io.zipkin.brave:brave-instrumentation-grpc")
```

[[sleuth-rpc-grpc-variant1-server-integration]]
===== Server Instrumentation

Spring Cloud Sleuth leverages grpc-spring-boot-starter to register Brave's gRPC server interceptor with all services annotated with `@GRpcService`.

[[sleuth-rpc-grpc-variant1-client-integration]]
===== Client Instrumentation

gRPC clients leverage a `ManagedChannelBuilder` to construct a `ManagedChannel` used to communicate to the gRPC server.
The native `ManagedChannelBuilder` provides static methods as entry points for the construction of `ManagedChannel` instances; however, this mechanism is outside the influence of the Spring application context.

IMPORTANT: Spring Cloud Sleuth provides a `SpringAwareManagedChannelBuilder` that can be customized through the Spring application context and injected by gRPC clients.
*This builder must be used when creating `ManagedChannel` instances.*

Sleuth creates a `TracingManagedChannelBuilderCustomizer` which injects Brave's client interceptor into the `SpringAwareManagedChannelBuilder`.
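
For example, a minimal sketch of a client that builds its channel through the injected builder (the target address and class name are illustrative assumptions):

[source,java,indent=0]
----
@Component
class TracedGrpcClient {

    private final ManagedChannel channel;

    TracedGrpcClient(SpringAwareManagedChannelBuilder builder) {
        // channels created through this builder carry Brave's client interceptor
        this.channel = builder.forAddress("localhost", 9090).usePlaintext().build();
    }
}
----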

[[sleuth-rpc-grpc-variant2-integration]]
==== Variant 2

https://github.com/yidongnan/grpc-spring-boot-starter[Grpc Spring Boot Starter] automatically detects the presence of Spring Cloud Sleuth and Brave's instrumentation for gRPC and registers the necessary client and/or server tooling.

[[sleuth-rxjava-integration]]
== RxJava

This feature is available for all tracer implementations.

We register a custom https://github.com/ReactiveX/RxJava/wiki/Plugins#rxjavaschedulershook[`RxJavaSchedulersHook`] that wraps all `Action0` instances in their Sleuth representative, which is called `TraceAction`.
The hook either starts or continues a span, depending on whether tracing was already going on before the Action was scheduled.
To disable the custom `RxJavaSchedulersHook`, set `spring.sleuth.rxjava.schedulers.hook.enabled` to `false`.

You can define a list of regular expressions for thread names for which you do not want spans to be created.
To do so, provide a comma-separated list of regular expressions in the `spring.sleuth.rxjava.schedulers.ignoredthreads` property.

IMPORTANT: The suggested approach to reactive programming and Sleuth is to use the Reactor support.

[[sleuth-circuitbreaker-integration]]
== Spring Cloud CircuitBreaker

This feature is available for all tracer implementations.

If you have Spring Cloud CircuitBreaker on the classpath, we wrap the passed command `Supplier` and the fallback `Function` in their trace representations. We also instrument the reactive implementation of the CircuitBreaker.
In order to disable this instrumentation set `spring.sleuth.circuitbreaker.enabled` to `false`.

[[sleuth-config-server-integration]]
== Spring Cloud Config Server

This feature is available for all tracer implementations.

If you have Spring Cloud Config Server running on the classpath, we will wrap the `EnvironmentRepository` in a span.
In order to disable this instrumentation set `spring.sleuth.config.server.enabled` to `false`.

[[sleuth-deployer-integration]]
== Spring Cloud Deployer

This feature is available for all tracer implementations.

If you have Spring Cloud Deployer running on the classpath, we wrap the `AppDeployer` in a trace representation. We are polling the application for its status at a default interval. You can change that default by setting the `spring.sleuth.deployer.status-poll-delay` property.
In order to disable this instrumentation set `spring.sleuth.deployer.enabled` to `false`.

[[sleuth-rsocket-integration]]
== Spring RSocket

This feature is available for all tracer implementations.

If you have Spring RSocket running on the classpath, we wrap the inbound and outbound communication to propagate the tracing context via the metadata.
In order to disable this instrumentation set `spring.sleuth.rsocket.enabled` to `false`.

[[sleuth-batch-integration]]
== Spring Batch

This feature is available for all tracer implementations.

If you have Spring Batch running on the classpath, we wrap the `StepBuilderFactory` and the `JobBuilderFactory` to propagate the tracing context.
In order to disable this instrumentation set `spring.sleuth.batch.enabled` to `false`.

[[sleuth-task-integration]]
== Spring Cloud Task

This feature is available for all tracer implementations.

If you have Spring Cloud Task running on the classpath, we instrument the `TaskExecutionListener`, `CommandLineRunner`, and `ApplicationRunner`.
In order to disable this instrumentation set `spring.sleuth.task.enabled` to `false`.

[[sleuth-tx-integration]]
== Spring Tx

This feature is available for all tracer implementations.

If you have Spring Tx on the classpath, we instrument the `PlatformTransactionManager` and the `ReactiveTransactionManager` to create a span whenever a new transaction is created. Due to technical constraints, we do not instrument classes that extend Spring's `AbstractPlatformTransactionManager`.
In order to disable this instrumentation set `spring.sleuth.tx.enabled` to `false`.

[[sleuth-security-integration]]
== Spring Security

This feature is available for all tracer implementations.

If you have Spring Security on the classpath, we create an implementation of `SecurityContextChangedListener` that annotates the current span with an event when the security context has changed.
In order to disable this instrumentation set `spring.sleuth.security.enabled` to `false`.

[[sleuth-r2dbc-integration]]
== R2DBC

This feature is available for all tracer implementations.

If you have R2DBC Proxy on the classpath, we instrument the `ConnectionFactory` so that it contains a custom `ProxyExecutionListener`.
In order to disable this instrumentation set `spring.sleuth.r2dbc.enabled` to `false`.

[[sleuth-vault-integration]]
== Spring Vault

This feature is available for all tracer implementations.

We're instrumenting the `RestTemplate` or `WebClient` instances used by Spring Vault to communicate with Vault.
In order to disable this instrumentation set `spring.sleuth.vault.enabled` to `false`.

[[sleuth-tomcat-integration]]
== Spring Tomcat

This feature is available for all tracer implementations.

We add an instrumented Tomcat `Valve` that originates the span.
In order to disable this instrumentation set `spring.sleuth.web.tomcat.enabled` to `false`.

[[sleuth-cassandra-integration]]
== Spring Data Cassandra

This feature is available for all tracer implementations.

We instrument Cassandra's `CqlSession` and `ReactiveSession` interfaces and provide our own implementation of the `RequestTracker`.
In order to disable this instrumentation set `spring.sleuth.cassandra.enabled` to `false`.

[[sleuth-jdbc-integration]]
== Spring JDBC

This feature is available for all tracer implementations. It has been ported from the https://github.com/gavlyukovskiy/spring-boot-data-source-decorator/[spring-boot-datasource-decorator] project.

We decorate `DataSource` instances in a trace representation. We delegate the actual proxying to either https://github.com/p6spy/p6spy[p6spy] or https://github.com/ttddyy/datasource-proxy[datasource-proxy]. In order to use this feature, you need to have one of them on the classpath.

====
[source,xml,indent=0,subs="verbatim,attributes",role="primary"]
.P6Spy Maven
----
<dependency>
    <groupId>p6spy</groupId>
    <artifactId>p6spy</artifactId>
    <version>${p6spy.version}</version>
    <scope>runtime</scope>
</dependency>
----

[source,groovy,indent=0,subs="verbatim,attributes",role="secondary"]
.P6Spy Gradle
----
runtimeOnly "p6spy:p6spy:${p6spyVersion}"
----

[source,xml,indent=0,subs="verbatim,attributes",role="secondary"]
.Datasource Proxy Maven
----
<dependency>
    <groupId>net.ttddyy</groupId>
    <artifactId>datasource-proxy</artifactId>
    <version>${datasource-proxy.version}</version>
    <scope>runtime</scope>
</dependency>
----

[source,groovy,indent=0,subs="verbatim,attributes",role="secondary"]
.Datasource Proxy Gradle
----
runtimeOnly "net.ttddyy:datasource-proxy:${datasourceProxyVersion}"
----
====

Please check the <> page under `spring.sleuth.jdbc.p6spy` for all p6spy configuration options and `spring.sleuth.jdbc.datasource-proxy` for all datasource proxy configuration options.

For P6Spy, logging of parameter values is disabled by default; set `spring.sleuth.jdbc.p6spy.tracing.include-parameter-values` to `true` to enable it.

You can configure P6Spy manually by using one of the available configuration methods. For more information, please refer to the http://p6spy.readthedocs.io/en/latest/configandusage.html[P6Spy Configuration Guide].

For Datasource Proxy, query logging is disabled by default; set `spring.sleuth.jdbc.datasource-proxy.slow-query.enable-logging` to `true` to enable logging of slow queries
and set `spring.sleuth.jdbc.datasource-proxy.query.enable-logging` to `true` to enable logging of all queries.

In order to disable this instrumentation set `spring.sleuth.jdbc.enabled` to `false`.

[[sleuth-mongodb-integration]]
== MongoDB

This feature is available for all tracer implementations.

We're adding command listeners that wrap all commands in a span.
If you want to have additional socket-address-related tags on the span, set `spring.sleuth.mongodb.socket-address-span-customizer.enabled` to `true`.

In order to disable this instrumentation set `spring.sleuth.mongodb.enabled` to `false`.

[[sleuth-session-integration]]
== Spring Session

This feature is available for all tracer implementations.

We instrument the `Session` repositories so that all operations are wrapped in a span.
In order to disable this instrumentation set `spring.sleuth.session.enabled` to `false`.

[[sleuth-kotlin-integration]]
== Kotlin Coroutines

This feature is available for all tracer implementations.

We add Kotlin Coroutines support that allows you to retrieve the current span via the `Tracer` bean. You can either pass the bean to the Kotlin Coroutine context via the `Tracer.asContextElement()` method or, if you have the Reactor Kotlin Coroutine integration on the classpath, we retrieve it from Reactor's context. To retrieve the current span, you can call the `currentSpan()` method within the Kotlin Coroutine.

[[sleuth-prometheus-exemplars-integration]]
== Prometheus Exemplars

This feature is available for all tracer implementations.

https://prometheus.io/docs/prometheus/latest/feature_flags/#exemplars-storage[Prometheus Exemplars] are supported through `SpanContextSupplier`. If you use https://micrometer.io[Micrometer], this will be auto-configured for you, but you can register `SpanContextSupplier` directly to Prometheus if you want. +
Please check the https://prometheus.io/docs/prometheus/latest/feature_flags/#exemplars-storage[Prometheus Docs], since this feature needs to be explicitly enabled on Prometheus' side, and it is only supported using the https://github.com/OpenObservability/OpenMetrics/blob/v1.0.0/specification/OpenMetrics.md#exemplars[OpenMetrics] format.



