Spring certified China Education Management Center - Spring Data MongoDB tutorial 15


18.7.1. Save using the registered Spring converter

The following example shows an implementation of Converter that converts from a Person object to an org.bson.Document:

import org.springframework.core.convert.converter.Converter;

import org.bson.Document;

public class PersonWriteConverter implements Converter<Person, Document> {

  public Document convert(Person source) {
    Document document = new Document();
    document.put("_id", source.getId());
    document.put("name", source.getFirstName());
    document.put("age", source.getAge());
    return document;
  }
}

18.7.2. Reading using a Spring Converter

The following example shows an implementation of a Converter that converts from a Document to a Person object:

import org.bson.Document;
import org.bson.types.ObjectId;
import org.springframework.core.convert.converter.Converter;

public class PersonReadConverter implements Converter<Document, Person> {

  public Person convert(Document source) {
    Person p = new Person((ObjectId) source.get("_id"), (String) source.get("name"));
    p.setAge((Integer) source.get("age"));
    return p;
  }
}

18.7.3. Registering Spring Converters with the MongoConverter

@Configuration
class MyMongoConfiguration extends AbstractMongoClientConfiguration {

	@Override
	public String getDatabaseName() {
		return "database";
	}

	@Override
	protected void configureConverters(MongoConverterConfigurationAdapter adapter) {
		adapter.registerConverter(new com.example.PersonReadConverter());
		adapter.registerConverter(new com.example.PersonWriteConverter());
	}
}

The following Spring Converter implementation example converts from a String to a custom Email value object:

public class EmailReadConverter implements Converter<String, Email> {

  public Email convert(String source) {
    return Email.valueOf(source);
  }
}

If you write a Converter whose source and target types are both native types, we cannot determine whether it should be treated as a reading or a writing converter. Registering the converter instance as both might lead to unwanted results. For example, a Converter<String, Long> is ambiguous, although it probably does not make sense to try to convert all String instances into Long instances when writing. To force the infrastructure to register a converter for only one direction, we provide the @ReadingConverter and @WritingConverter annotations to be used in the converter implementation.
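
As a self-contained illustration of how such direction markers work, the sketch below declares simplified stand-ins for the @ReadingConverter and @WritingConverter annotations (the real ones live in Spring Data's org.springframework.data.convert package) and shows that a runtime-retained marker is enough to tell which direction a Converter<String, Long> should be registered for:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

public class ReadingWritingDemo {

    // Simplified stand-ins for Spring Data's marker annotations
    // (assumption: the real ones are in org.springframework.data.convert).
    @Retention(RetentionPolicy.RUNTIME)
    @interface ReadingConverter {}

    @Retention(RetentionPolicy.RUNTIME)
    @interface WritingConverter {}

    // Both String and Long are store-native types, so the direction must be
    // forced: this converter should be applied only when reading from the store.
    @ReadingConverter
    static class StringToLongConverter {
        Long convert(String source) { return Long.valueOf(source); }
    }

    public static void main(String[] args) {
        // The infrastructure can inspect the marker at runtime to pick a direction.
        boolean readOnly = StringToLongConverter.class.isAnnotationPresent(ReadingConverter.class)
                && !StringToLongConverter.class.isAnnotationPresent(WritingConverter.class);
        System.out.println(readOnly); // true
    }
}
```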

Converters are subject to explicit registration, as instances are not picked up from a classpath or container scan, to avoid unwanted registration with a conversion service and the side effects of such a registration. Converters are registered with CustomConversions as the central facility that allows registration and querying of registered converters based on source and target types.
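
The registration-and-lookup idea behind CustomConversions can be sketched with a toy registry keyed by source and target type. Everything below (class and method names included) is hypothetical and only mirrors the described behavior, not Spring Data's actual implementation:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;
import java.util.function.Function;

// Toy sketch of a central converter registry queried by (source, target) pair.
public class ConversionRegistryDemo {

    static final Map<String, Function<Object, Object>> registry = new HashMap<>();

    // Register a converter under its source->target type pair.
    static <S, T> void register(Class<S> source, Class<T> target, Function<S, T> fn) {
        registry.put(source.getName() + "->" + target.getName(),
                obj -> fn.apply(source.cast(obj)));
    }

    // Look up a converter by source and target type.
    static Optional<Function<Object, Object>> find(Class<?> source, Class<?> target) {
        return Optional.ofNullable(registry.get(source.getName() + "->" + target.getName()));
    }

    public static void main(String[] args) {
        register(String.class, Integer.class, Integer::valueOf);
        System.out.println(find(String.class, Integer.class).get().apply("42")); // 42
        System.out.println(find(Integer.class, String.class).isPresent());       // false
    }
}
```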

CustomConversions comes with a predefined set of converter registrations:

  • JSR-310 converters for converting between java.time, java.util.Date, and String types.
  • Deprecated: Joda-Time converters for converting between org.joda.time, JSR-310, and java.util.Date.
  • Deprecated: ThreeTen Backport converters for converting between org.threeten.bp, JSR-310, and java.util.Date.

The default converters for local temporal types (such as LocalDateTime to java.util.Date) rely on the system default time zone setting to convert between those types. You can override the default converters by registering your own converter.
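
A minimal JDK-only sketch of what such a converter does, contrasting the system default zone with an explicitly pinned zone (which is what a custom replacement converter would typically use):

```java
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZoneOffset;
import java.util.Date;

public class LocalDateTimeConversionDemo {
    public static void main(String[] args) {
        LocalDateTime ldt = LocalDateTime.of(2021, 11, 29, 12, 0);

        // What a default converter effectively does: interpret the local value
        // in the system default time zone.
        Date viaDefaultZone = Date.from(ldt.atZone(ZoneId.systemDefault()).toInstant());

        // A custom converter could pin an explicit zone (here UTC) instead.
        Date viaUtc = Date.from(ldt.toInstant(ZoneOffset.UTC));

        // The two results differ by the system zone's offset from UTC.
        System.out.println(viaDefaultZone);
        System.out.println(viaUtc);
    }
}
```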

Converter disambiguation

Generally, we inspect Converter implementations for the source and target types they convert from and to. Depending on whether one of those is a type the underlying data access API can handle natively, we register the converter instance as a reading or a writing converter. The following examples show a writing and a reading converter (note that the difference lies in the order of the qualifiers on Converter):

// Write converter as only the target type is one that can be handled natively
class MyConverter implements Converter<Person, String> { ... }

// Read converter as only the source type is one that can be handled natively
class MyConverter implements Converter<String, Person> { ... }

19. Sharding

MongoDB supports large data sets through sharding, a method of distributing data across multiple database servers. Please refer to the MongoDB documentation to learn how to set up a sharded cluster, as well as its requirements and limitations.

Spring Data MongoDB uses the @Sharded annotation to identify entities stored in sharded collections, as shown below.

@Sharded(shardKey = { "country", "userId" })
public class User {

	@Id
	Long id;

	String userId;

	String country;
}

The properties of the shard key are mapped to the actual field names.

19.1. Sharded Collections

Spring Data MongoDB does not automatically set up sharding for the required collections or indexes. The following snippet shows how to do so using the MongoDB client API.

MongoDatabase adminDB = template.getMongoDbFactory()
	.getMongoDatabase("admin");

adminDB.runCommand(new Document("enableSharding", "db"));

Document shardCmd = new Document("shardCollection", "db.users")
	.append("key", new Document("country", 1).append("userid", 1));

adminDB.runCommand(shardCmd);


  • You need to run the sharding commands against the admin database.
  • Enable sharding for a specific database, if necessary.
  • Shard a collection within the sharded database.
  • Specify the shard key. This example uses range-based sharding.

19.2. Shard Key Handling

The shard key consists of one or more properties that must exist in every document in the target collection. It is used to distribute documents across shards.

Adding the @Sharded annotation to an entity enables Spring Data MongoDB to apply the best-effort optimizations required for sharded scenarios. This essentially means adding the required shard key information, if not already present, to the replaceOne filter query when upserting entities. This may require an additional server round trip to determine the actual value of the current shard key.

By setting @Sharded(immutableKey = true), Spring Data does not attempt to check whether the shard key of an entity has changed.

Please see the MongoDB documentation for further details. The following list contains operations that are eligible for automatic shard key inclusion:

  • (Reactive)CrudRepository.save(...)
  • (Reactive)CrudRepository.saveAll(...)
  • (Reactive)MongoTemplate.save(...)
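
The shard key inclusion described above can be sketched with plain maps standing in for org.bson.Document. The enrichFilter helper below is hypothetical, not Spring Data API; it only mirrors the described behavior of adding missing shard key fields to the id-based replaceOne filter:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class ShardKeyFilterDemo {

    // Hypothetical helper: add the shard key fields from the entity to the
    // id-based filter used by replaceOne, if they are not present yet.
    static Map<String, Object> enrichFilter(Map<String, Object> filter,
                                            Map<String, Object> entity,
                                            List<String> shardKey) {
        Map<String, Object> enriched = new LinkedHashMap<>(filter);
        for (String key : shardKey) {
            enriched.putIfAbsent(key, entity.get(key));
        }
        return enriched;
    }

    public static void main(String[] args) {
        Map<String, Object> user = Map.of("_id", 1L, "userId", "u42", "country", "AT");
        Map<String, Object> filter = enrichFilter(
                new LinkedHashMap<>(Map.of("_id", 1L)), user, List.of("country", "userId"));
        System.out.println(filter); // {_id=1, country=AT, userId=u42}
    }
}
```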

20. Kotlin support

Kotlin is a statically typed language targeting the JVM (and other platforms) that allows you to write concise and elegant code while providing excellent interoperability with existing libraries written in Java.

Spring Data provides first-class support for Kotlin, letting developers write Kotlin applications almost as if Spring Data were a Kotlin-native framework.

The easiest way to build a Spring application with Kotlin is to use Spring Boot and its dedicated Kotlin support. This comprehensive tutorial teaches you how to build Spring Boot applications with Kotlin using start.spring.io.

20.1. Requirements

Spring Data supports Kotlin 1.3 and requires kotlin-stdlib (or one of its variants, such as kotlin-stdlib-jdk8) and kotlin-reflect to be present on the classpath. These are provided by default if you bootstrap a Kotlin project through start.spring.io.

20.2. Null Safety

One of Kotlin's key features is null safety, which cleanly deals with null values at compile time. This makes applications safer through nullability declarations and "value or no value" semantics, without paying the cost of wrappers such as Optional. (Kotlin allows using functional constructs with nullable values. See this comprehensive guide to Kotlin null safety.)

Although Java does not let you express null safety in its type system, the Spring Data APIs are annotated with JSR-305 tooling-friendly annotations declared in the org.springframework.lang package. By default, types from Java APIs used in Kotlin are recognized as platform types, for which null checks are relaxed. Kotlin's support for JSR-305 annotations and Spring nullability annotations provides null safety for the whole Spring Data API to Kotlin developers, with the advantage of dealing with null-related issues at compile time.

See Null Handling of Repository Methods for how null safety applies to Spring Data repositories.

You can configure JSR-305 checks by adding the -Xjsr305 compiler flag with the following options: -Xjsr305={strict|warn|ignore}

For Kotlin versions 1.1+, the default behavior is the same as -Xjsr305=warn. The strict value is required to take the null safety of the Spring Data API into account. Kotlin types are inferred from the Spring API, but you should be aware that Spring API nullability declarations could evolve even between minor releases and that more checks may be added in the future.

Generic type parameters, variable parameters, and nullability of array elements are not yet supported, but should be provided in an upcoming release.

20.3. Object mapping

For details on how Kotlin objects are materialized, see Kotlin support.

20.4. Extension

Kotlin extensions provide the ability to extend existing classes with additional functionality. The Spring Data Kotlin APIs use these extensions to add new Kotlin-specific conveniences to existing Spring APIs.

Keep in mind that Kotlin extensions need to be imported to be used. As with static imports, the IDE should automatically suggest the import in most cases. For example, Kotlin reified type parameters provide a workaround for JVM generics type erasure, and Spring Data provides some extensions to take advantage of this feature. This allows for a better Kotlin API.

To retrieve a list of SWCharacter objects in Java, you would normally write the following:

Flux<SWCharacter> characters = template.find(SWCharacter.class).inCollection("star-wars").all();

Using the Kotlin and Spring Data extensions, you can write the following instead:

val characters = template.find<SWCharacter>().inCollection("star-wars").all()
// or (both are equivalent)
val characters : Flux<SWCharacter> = template.find().inCollection("star-wars").all()

As in Java, characters in Kotlin is strongly typed, but Kotlin's clever type inference allows for shorter syntax.

Spring Data MongoDB provides the following extensions:

  • Reified generics support for MongoOperations, ReactiveMongoOperations, FluentMongoOperations, ReactiveFluentMongoOperations, and Criteria.
  • Type-safe queries for Kotlin.
  • Coroutines extensions for ReactiveFluentMongoOperations.

20.5. Coroutines

Kotlin coroutines are lightweight threads that allow you to write non-blocking code imperatively. On the language side, suspending functions provide an abstraction for asynchronous operations, while on the library side kotlinx.coroutines provides functions like async { } and types like Flow.

Spring Data modules provide support for Coroutines in the following scope:

  • Deferred and Flow return value support in Kotlin extensions

20.5.1. Dependencies

Coroutines support is enabled when the kotlinx-coroutines-core, kotlinx-coroutines-reactive, and kotlinx-coroutines-reactor dependencies are on the classpath:

Example 211. Dependencies added in Maven pom.xml

<dependency>
	<groupId>org.jetbrains.kotlinx</groupId>
	<artifactId>kotlinx-coroutines-core</artifactId>
	<version>${kotlinx-coroutines.version}</version>
</dependency>

<dependency>
	<groupId>org.jetbrains.kotlinx</groupId>
	<artifactId>kotlinx-coroutines-reactor</artifactId>
	<version>${kotlinx-coroutines.version}</version>
</dependency>

Supported versions are 1.3.0 and above.

20.5.2. How does Reactive translate to Coroutines?

For return values, the translation from the Reactive API to the Coroutines API is as follows:

  • fun handler(): Mono<Void> becomes suspend fun handler()
  • fun handler(): Mono<T> becomes suspend fun handler(): T or suspend fun handler(): T?, depending on whether the Mono can be empty (with the advantage of being more statically typed)
  • fun handler(): Flux<T> becomes fun handler(): Flow<T>

Flow is the Flux equivalent in the Coroutines world, suitable for hot or cold streams and for finite or infinite streams, with the following main differences:

  • Flow is push-based, while Flux is a push-pull hybrid
  • Backpressure is implemented via suspending functions
  • Flow has only a single suspending collect method, and operators are implemented as extensions
  • Operators are easy to implement thanks to Coroutines
  • Extensions allow adding custom operators to Flow
  • Collect operations are suspending functions
  • The map operator supports asynchronous operations (no flatMap needed) since it takes a suspending function parameter

Read this blog post about Going Reactive with Spring, Coroutines and Kotlin Flow for more details, including how to run code concurrently with Coroutines.

20.5.3. Repository

This is an example of a Coroutines Repository:

interface CoroutineRepository : CoroutineCrudRepository<User, String> {

    suspend fun findOne(id: String): User

    fun findByFirstname(firstname: String): Flow<User>

    suspend fun findAllByFirstname(id: String): List<User>
}

Coroutines repositories are built on reactive repositories to expose the non-blocking nature of data access through Kotlin's Coroutines. Methods on a Coroutines repository can be backed either by a query method or by a custom implementation. Invoking a custom implementation method propagates the Coroutines invocation to the actual implementation method if the custom method is suspendable, without requiring the implementation method to return a reactive type such as Mono or Flux.

The coroutine repository is discovered only when the repository extends the CoroutineCrudRepository interface.

21. JMX support

MongoDB's JMX support exposes the results of running the serverStatus command on the admin database of a single MongoDB server instance. It also exposes an administrative MBean, MongoAdmin, that lets you perform administrative operations, such as dropping or creating a database. The JMX features build on the JMX feature set available in the Spring Framework. See here for more details.

21.1. MongoDB JMX Configuration

Spring's Mongo namespace allows you to enable JMX functionality, as shown in the following example:

Example 212. Configuring the XML schema of MongoDB

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:context="http://www.springframework.org/schema/context"
    xmlns:mongo="http://www.springframework.org/schema/data/mongo"
    xmlns:p="http://www.springframework.org/schema/p"
    xsi:schemaLocation="
    http://www.springframework.org/schema/context
    https://www.springframework.org/schema/context/spring-context-3.0.xsd
    http://www.springframework.org/schema/data/mongo
    https://www.springframework.org/schema/data/mongo/spring-mongo-1.0.xsd
    http://www.springframework.org/schema/beans https://www.springframework.org/schema/beans/spring-beans-3.0.xsd">

    <!-- Default bean name is 'mongo' -->
    <mongo:mongo-client host="localhost" port="27017"/>

    <!-- by default look for a Mongo object named 'mongo' -->
    <mongo:jmx/>

    <context:mbean-export/>

    <!-- To translate any MongoExceptions thrown in @Repository annotated classes -->
    <context:annotation-config/>

    <bean id="registry" class="org.springframework.remoting.rmi.RmiRegistryFactoryBean" p:port="1099" />

    <!-- Expose JMX over RMI -->
    <bean id="serverConnector" class="org.springframework.jmx.support.ConnectorServerFactoryBean"
        p:serviceUrl="service:jmx:rmi://localhost/jndi/rmi://localhost:1099/myconnector" />

</beans>


The previous code exposes several MBeans:

  • AssertMetrics
  • BackgroundFlushingMetrics
  • BtreeIndexCounters
  • ConnectionMetrics
  • GlobalLockMetrics
  • MemoryMetrics
  • OperationCounters
  • ServerInfo
  • MongoAdmin
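
As a generic, JDK-only illustration of how a JMX client discovers MBeans like those listed above once they are exported, the following sketch queries the platform MBeanServer (a MongoDB-specific domain would appear alongside the standard java.lang ones; this is plain JMX usage, not Spring Data API):

```java
import java.lang.management.ManagementFactory;
import java.util.Set;
import javax.management.MBeanServer;
import javax.management.ObjectName;

public class JmxQueryDemo {
    public static void main(String[] args) throws Exception {
        // The in-process MBean server that <context:mbean-export/> registers with.
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();

        // Query all registered MBeans; exported Mongo MBeans would show up here
        // alongside the JVM's built-in platform MBeans.
        Set<ObjectName> names = server.queryNames(null, null);
        for (ObjectName name : names) {
            System.out.println(name);
        }
    }
}
```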

The following screenshot of JConsole shows the generated configuration:

Posted on Mon, 29 Nov 2021 17:38:00 -0500 by sarika