Enabling Spring Cloud OpenFeign configuration refresh: the project fails to start

This article covers the underlying design, the principles involved, and the problem diagnosis. It is fairly in-depth and long, so it is split into two parts:

  • Part 1: a brief description of the problem and the principles behind Spring Cloud RefreshScope
  • Part 2: the current bug caused by Spring Cloud OpenFeign + Spring Cloud Sleuth and how to fix it

In a recent project, we wanted OpenFeign configuration to be dynamically refreshable (mainly the Feign Options configuration), for example:

feign:
  client:
    config:
      default:
        # Connect timeout
        connectTimeout: 500
        # Read timeout
        readTimeout: 8000

We may observe that the timeout for calls through a certain FeignClient is unreasonable and needs to be adjusted temporarily. We don't want to restart the process or refresh the whole ApplicationContext for that, so we put this part of the configuration into Spring Cloud Config and refresh it through the dynamic refresh mechanism. This configuration option is officially provided; refer to the official documentation: Spring @RefreshScope Support.

That is, add the following configuration to the project:

feign.client.refresh-enabled: true

However, in our project, after adding this configuration, startup fails with an error that the relevant Bean cannot be found:

Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException: No bean named 'feign.Request.Options-testService1Client' available
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanDefinition(DefaultListableBeanFactory.java:863)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getMergedLocalBeanDefinition(AbstractBeanFactory.java:1344)
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:309)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:213)
	at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1160)
	at org.springframework.cloud.openfeign.FeignContext.getInstance(FeignContext.java:57)
	at org.springframework.cloud.openfeign.FeignClientFactoryBean.getOptionsByName(FeignClientFactoryBean.java:363)
	at org.springframework.cloud.openfeign.FeignClientFactoryBean.configureUsingConfiguration(FeignClientFactoryBean.java:195)
	at org.springframework.cloud.openfeign.FeignClientFactoryBean.configureFeign(FeignClientFactoryBean.java:158)
	at org.springframework.cloud.openfeign.FeignClientFactoryBean.feign(FeignClientFactoryBean.java:132)
	at org.springframework.cloud.openfeign.FeignClientFactoryBean.getTarget(FeignClientFactoryBean.java:382)
	at org.springframework.cloud.openfeign.FeignClientFactoryBean.getObject(FeignClientFactoryBean.java:371)
	at org.springframework.cloud.openfeign.FeignClientsRegistrar.lambda$registerFeignClient$0(FeignClientsRegistrar.java:235)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.obtainFromSupplier(AbstractAutowireCapableBeanFactory.java:1231)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1173)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:564)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:524)
	... 74 more

Problem analysis

From the Bean name we can see that this Bean is the Feign.Options we mentioned above, the one we want to refresh dynamically; it holds the connect timeout, read timeout, and other settings. The part after the dash in the name is the contextId from the @FeignClient annotation on the FeignClient we declared.
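For illustration, here is a hypothetical FeignClient declaration whose contextId matches the Bean name in the error above (the interface name, service name, and endpoint are made up for this example):

import org.springframework.cloud.openfeign.FeignClient;
import org.springframework.web.bind.annotation.GetMapping;

//Matching the error above, the per-client Options Bean name is "feign.Request.Options-" + contextId,
//i.e. "feign.Request.Options-testService1Client" for this client
@FeignClient(name = "test-service-1", contextId = "testService1Client")
public interface TestService1Client {
    @GetMapping("/status")
    String status();
}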

When a FeignClient is created, this Feign.Options Bean needs to be loaded. Each FeignClient has its own ApplicationContext, and the Feign.Options Bean lives in that per-client ApplicationContext. This is implemented through Spring Cloud's NamedContextFactory. For an in-depth analysis of NamedContextFactory, please refer to my article:

For OpenFeign configuration, enabling dynamic refresh essentially means refreshing the Feign.Options Bean of each Feign client. So how is that achieved? Let's first look at how Spring Cloud implements dynamic Bean refresh.
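For orientation, here is a minimal sketch (a hypothetical bean of my own, assuming spring-cloud-context is on the classpath) of what a refresh-scoped bean looks like from the user's side; after a refresh event such a bean is lazily re-created and picks up the latest property values:

import org.springframework.beans.factory.annotation.Value;
import org.springframework.cloud.context.config.annotation.RefreshScope;
import org.springframework.stereotype.Component;

@RefreshScope
@Component
public class TimeoutProperties {
    //Re-read from the (possibly updated) Environment when the bean is re-created after a refresh
    @Value("${feign.client.config.default.readTimeout:8000}")
    private int readTimeout;

    public int getReadTimeout() {
        return readTimeout;
    }
}

A refresh can be triggered, for example, through the actuator refresh endpoint or programmatically via ContextRefresher.refresh(). Under the hood this relies on Spring's Scope mechanism, so first we need to find out what a Scope is.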

The Scope of a Bean

Literally, a Scope is the scope of a Bean. In terms of implementation, a Scope defines how a Bean is obtained when we request it.

The Spring framework ships with two familiar scopes: singleton and prototype. Singleton means that every time a Bean is obtained from the BeanFactory (getBean), the same object is returned for a given Bean, i.e. the singleton pattern. Prototype means that every time a Bean is obtained from the BeanFactory, a new object is created and returned for that Bean, i.e. the factory pattern.
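As a quick illustration (the bean names here are made up for the demo), the two built-in scopes behave like this:

import org.springframework.beans.factory.config.ConfigurableBeanFactory;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Scope;

@Configuration
public class BuiltInScopeDemo {
    @Bean //default scope is singleton: getBean always returns the same instance
    public StringBuilder singletonBean() {
        return new StringBuilder();
    }

    @Bean
    @Scope(ConfigurableBeanFactory.SCOPE_PROTOTYPE) //prototype: getBean creates a new instance every time
    public StringBuilder prototypeBean() {
        return new StringBuilder();
    }

    public static void main(String[] args) {
        AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext(BuiltInScopeDemo.class);
        System.out.println(context.getBean("singletonBean") == context.getBean("singletonBean")); //true
        System.out.println(context.getBean("prototypeBean") == context.getBean("prototypeBean")); //false
    }
}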

We can also extend Scope according to our own needs and define how beans are obtained. As a simple example, let's write a custom TestScope. A custom Scope is a class implementing the org.springframework.beans.factory.config.Scope interface, which defines the operations for obtaining beans under that Scope.

public interface Scope {
    //Obtain the bean; called when BeanFactory.getBean is invoked
    Object get(String name, ObjectFactory<?> objectFactory);
    //Called when BeanFactory.destroyScopedBean is invoked
    @Nullable
    Object remove(String name);
    //Register a destruction callback
    //Optional: the BeanFactory registers the bean's destruction callback here; you can run it when the bean is removed
    void registerDestructionCallback(String name, Runnable callback);
    //Optional: resolve a contextual object that is not in the BeanFactory but created per context, e.g. an object created per HTTP request
    Object resolveContextualObject(String key);
    //Optional: an identifier, similar to a session id, that lets callers distinguish different contexts
    String getConversationId();
}

Let's implement a simple Scope:

public static class TestScope implements Scope {
    @Override
    public Object get(String name, ObjectFactory<?> objectFactory) {
        return objectFactory.getObject();
    }
    @Override
    public Object remove(String name) {
        return null;
    }
    @Override
    public void registerDestructionCallback(String name, Runnable callback) {
    }
    @Override
    public Object resolveContextualObject(String key) {
        return null;
    }
    @Override
    public String getConversationId() {
        return null;
    }
}

This Scope only implements the get method: it creates a new Bean directly from the objectFactory that is passed in. Under this Scope, every call to BeanFactory.getBean returns a new Bean, and instances autowired into different beans are also different objects. Let's write a test:

@Configuration
public static class Config {
    @Bean
    //The name of the custom Scope is testScope
    @org.springframework.context.annotation.Scope(value = "testScope")
    public A a() {
        return new A();
    }
    //Autowired field
    @Autowired
    private A a;
}

public static class A {
    public void test() {
        System.out.println(this);
    }
}
public static void main(String[] args) {
    //Create an ApplicationContext
    AnnotationConfigApplicationContext annotationConfigApplicationContext = new AnnotationConfigApplicationContext();
    //Register our custom Scope
    annotationConfigApplicationContext.getBeanFactory().registerScope("testScope", new TestScope());
    //Register the configuration Bean we need
    annotationConfigApplicationContext.register(Config.class);
    //Call refresh to initialize ApplicationContext
    annotationConfigApplicationContext.refresh();
    //Get Config Bean
    Config config = annotationConfigApplicationContext.getBean(Config.class);
    //Call the automatically loaded Bean
    config.a.test();
    //Call getBean from BeanFactory to get A
    annotationConfigApplicationContext.getBean(A.class).test();
    annotationConfigApplicationContext.getBean(A.class).test();
}

After executing the code, we can see from the output that these three A instances are different objects:

com.hopegaming.spring.cloud.parent.ScopeTest$A@5241cf67
com.hopegaming.spring.cloud.parent.ScopeTest$A@716a7124
com.hopegaming.spring.cloud.parent.ScopeTest$A@77192705

Let's modify our Bean to make it a DisposableBean:

public static class A implements DisposableBean {
    public void test() {
        System.out.println(this);
    }

    @Override
    public void destroy() throws Exception {
        System.out.println(this + " is destroyed");
    }
}

Modify our custom Scope:

public static class TestScope implements Scope {
    private Runnable callback;
    @Override
    public Object get(String name, ObjectFactory<?> objectFactory) {
        return objectFactory.getObject();
    }

    @Override
    public Object remove(String name) {
        System.out.println(name + " is removed");
        this.callback.run();
        System.out.println("callback finished");
        return null;
    }

    @Override
    public void registerDestructionCallback(String name, Runnable callback) {
        System.out.println("registerDestructionCallback is called");
        this.callback = callback;
    }

    @Override
    public Object resolveContextualObject(String key) {
        System.out.println("resolveContextualObject is called");
        return null;
    }

    @Override
    public String getConversationId() {
        System.out.println("getConversationId is called");
        return null;
    }
}

In the test code, call destroyScopedBean to destroy the bean:

annotationConfigApplicationContext.getBeanFactory().destroyScopedBean("a");

Run the code and you can see the corresponding output:

registerDestructionCallback is called
a is removed
com.hopegaming.spring.cloud.parent.ScopeTest$A@716a7124 is destroyed
callback finished

For a DisposableBean, or other beans with lifecycle callbacks, the BeanFactory passes the required destruction logic in through registerDestructionCallback. When BeanFactory.destroyScopedBean is used to destroy a Bean, the Scope's remove method is called; once our own removal logic is done, we can run the registered callback to complete the Bean's lifecycle.

Next, let's implement a singleton-like Scope in a very simple way, based mainly on ConcurrentHashMap:

public static class TestScope implements Scope {

    private final ConcurrentHashMap<String, Object> map = new ConcurrentHashMap<>();
    private final ConcurrentHashMap<String, Runnable> callback = new ConcurrentHashMap<>();

    @Override
    public Object get(String name, ObjectFactory<?> objectFactory) {
        System.out.println("get is called");
        return map.compute(name, (k, v) -> {
            if (v == null) {
                v = objectFactory.getObject();
            }
            return v;
        });
    }

    @Override
    public Object remove(String name) {
        this.map.remove(name);
        System.out.println(name + " is removed");
        this.callback.get(name).run();
        System.out.println("callback finished");
        return null;
    }

    @Override
    public void registerDestructionCallback(String name, Runnable callback) {
        System.out.println("registerDestructionCallback is called");
        this.callback.put(name, callback);
    }

    @Override
    public Object resolveContextualObject(String key) {
        return null;
    }

    @Override
    public String getConversationId() {
        return null;
    }
}

We use two ConcurrentHashMaps to cache the beans under this Scope and their corresponding destruction callbacks. This implementation behaves much like the singleton pattern. Now run the following test:

public static void main(String[] args) {
    AnnotationConfigApplicationContext annotationConfigApplicationContext = new AnnotationConfigApplicationContext();
    annotationConfigApplicationContext.getBeanFactory().registerScope("testScope", new TestScope());
    annotationConfigApplicationContext.register(Config.class);
    annotationConfigApplicationContext.refresh();
    Config config = annotationConfigApplicationContext.getBean(Config.class);
    config.a.test();
    annotationConfigApplicationContext.getBean(A.class).test();
    //The method name of the Bean registered in the Config class is a, so the Bean name is also a
    annotationConfigApplicationContext.getBeanFactory().destroyScopedBean("a");
    config.a.test();
    annotationConfigApplicationContext.getBean(A.class).test();
}

Before destroying the Bean, we obtain Bean A both through the autowired field and through BeanFactory.getBean and call the test method. Then we destroy the Bean. After that, we again obtain Bean A through the autowired field and through BeanFactory.getBean and call test. The output shows that BeanFactory.getBean returns a new Bean, but the autowired field still references the Bean that has already been destroyed. So how can autowired references pick up the new Bean, i.e. how do we achieve re-injection?

This involves another attribute on the Scope annotation: the proxy mode:

@Target({ElementType.TYPE, ElementType.METHOD})
@Retention(RetentionPolicy.RUNTIME)
@Documented
public @interface Scope {
	@AliasFor("scopeName")
	String value() default "";
	@AliasFor("value")
	String scopeName() default "";
	ScopedProxyMode proxyMode() default ScopedProxyMode.DEFAULT;
}

The third attribute, proxyMode (of type ScopedProxyMode), configures whether the original Bean object or a proxy object is returned when the Bean is obtained (this also affects autowiring):

public enum ScopedProxyMode {
    //Use the default configuration; falls back to NO if no surrounding configuration says otherwise
    DEFAULT,
    //Use the original object as the Bean
    NO,
    //JDK dynamic proxy (interface-based)
    INTERFACES,
    //CGLIB dynamic proxy (subclass-based)
    TARGET_CLASS
}

Let's test what happens when the actual object of the scoped Bean is a proxy. We modify the test code above to use a CGLIB dynamic proxy. The modified configuration:

@Configuration
public static class Config {
    @Bean
    @org.springframework.context.annotation.Scope(value = "testScope"
            //Specifies that the proxy mode is CGLIB based
            , proxyMode = ScopedProxyMode.TARGET_CLASS
    )
    public A a() {
        return new A();
    }
    @Autowired
    private A a;
}

Write the test main method:

public static void main(String[] args) {
    AnnotationConfigApplicationContext annotationConfigApplicationContext = new AnnotationConfigApplicationContext();
    annotationConfigApplicationContext.getBeanFactory().registerScope("testScope", new TestScope());
    annotationConfigApplicationContext.register(Config.class);
    annotationConfigApplicationContext.refresh();
    Config config = annotationConfigApplicationContext.getBean(Config.class);
    config.a.test();
    annotationConfigApplicationContext.getBean(A.class).test();
    //View the type of Bean instance
    System.out.println(config.a.getClass());
    System.out.println(annotationConfigApplicationContext.getBean(A.class).getClass());
    //At this time, we need to note that the name of the proxy Bean has changed and needs to be obtained through ScopedProxyUtils
    annotationConfigApplicationContext.getBeanFactory().destroyScopedBean(ScopedProxyUtils.getTargetBeanName("a"));
    config.a.test();
    annotationConfigApplicationContext.getBean(A.class).test();
}

Execute the program, and the output is:

get is called
registerDestructionCallback is called
com.hopegaming.spring.cloud.parent.ScopeTest$A@3dd69f5a
get is called
com.hopegaming.spring.cloud.parent.ScopeTest$A@3dd69f5a
class com.hopegaming.spring.cloud.parent.ScopeTest$A$$EnhancerBySpringCGLIB$$2fa625ee
class com.hopegaming.spring.cloud.parent.ScopeTest$A$$EnhancerBySpringCGLIB$$2fa625ee
scopedTarget.a is removed
com.hopegaming.spring.cloud.parent.ScopeTest$A@3dd69f5a is destroyed
callback finished
get is called
registerDestructionCallback is called
com.hopegaming.spring.cloud.parent.ScopeTest$A@3aa3193a
get is called
com.hopegaming.spring.cloud.parent.ScopeTest$A@3aa3193a

As can be seen from the output:

  • Every call on the autowired Bean goes through the get method of the custom Scope to fetch the Bean
  • Every time the Bean is obtained through the BeanFactory, the get method of the custom Scope is also called to re-obtain the Bean
  • The obtained Bean instance is a CGLIB proxy object
  • After the Bean is destroyed, whether it is obtained through the BeanFactory or through autowiring, it is a new Bean

So how does Scope implement all of this? Let's briefly analyze the source code.

How Scope works

If a Bean does not declare any Scope, its Scope defaults to singleton; that is, beans are singletons by default. The BeanFactory generates a merged Bean definition for each Bean, and this default value is assigned there. The corresponding source code:

AbstractBeanFactory

protected RootBeanDefinition getMergedBeanDefinition(
	String beanName, BeanDefinition bd, @Nullable BeanDefinition containingBd)
	throws BeanDefinitionStoreException {
    //Omit the source code we don't care about
	if (!StringUtils.hasLength(mbd.getScope())) {
		mbd.setScope(SCOPE_SINGLETON);
	}
	//Omit the source code we don't care about
}

To declare that a Bean has a special Scope, we must first define the custom Scope and register it with the BeanFactory. The Scope name must be globally unique, because scopes are distinguished by this name. The corresponding source code for registering a Scope:

AbstractBeanFactory

@Override
public void registerScope(String scopeName, Scope scope) {
	Assert.notNull(scopeName, "Scope identifier must not be null");
	Assert.notNull(scope, "Scope must not be null");
	//The built-in singleton and prototype scopes cannot be overridden
	if (SCOPE_SINGLETON.equals(scopeName) || SCOPE_PROTOTYPE.equals(scopeName)) {
		throw new IllegalArgumentException("Cannot replace existing scopes 'singleton' and 'prototype'");
	}
	//Put it into the scopes map, where key is the name and value is the custom Scope
	Scope previous = this.scopes.put(scopeName, scope);
	//It can be seen that the later ones will replace the previous ones, which we should try to avoid.
	if (previous != null && previous != scope) {
		if (logger.isDebugEnabled()) {
			logger.debug("Replacing scope '" + scopeName + "' from [" + previous + "] to [" + scope + "]");
		}
	}
	else {
		if (logger.isTraceEnabled()) {
			logger.trace("Registering scope '" + scopeName + "' with implementation [" + scope + "]");
		}
	}
}

Once a Bean is declared with a special Scope, there is dedicated logic when the Bean is obtained. See the core source code for obtaining a Bean through the BeanFactory:

AbstractBeanFactory

@SuppressWarnings("unchecked")
protected <T> T doGetBean(
	String name, @Nullable Class<T> requiredType, @Nullable Object[] args, boolean typeCheckOnly)
	throws BeansException {
	//Omit the source code we don't care about
	// Create Bean instance
    if (mbd.isSingleton()) {
    	//Create or return a singleton instance
    } else if (mbd.isPrototype()) {
    	//Create a new instance at a time
    } else {
    	//If you go here, it means that this Bean belongs to a custom Scope
    	String scopeName = mbd.getScope();
    	//You must have a Scope name
    	if (!StringUtils.hasLength(scopeName)) {
    		throw new IllegalStateException("No scope name defined for bean ยด" + beanName + "'");
    	}
    	//Get the corresponding Scope through the Scope name, and the custom Scope needs to be registered manually
    	Scope scope = this.scopes.get(scopeName);
    	if (scope == null) {
    		throw new IllegalStateException("No Scope registered for scope name '" + scopeName + "'");
    	}
    	try {
    		//Call the get method of the custom Scope to get the Bean
    		Object scopedInstance = scope.get(beanName, () -> {
    			//At the same time, the callback of the life cycle required to create the Bean is passed in to create the Bean
    			beforePrototypeCreation(beanName);
    			try {
    				return createBean(beanName, mbd, args);
    			}
    			finally {
    				afterPrototypeCreation(beanName);
    			}
    		});
    		beanInstance = getObjectForBeanInstance(scopedInstance, name, beanName, mbd);
    	}
    	catch (IllegalStateException ex) {
    		throw new ScopeNotActiveException(beanName, scopeName, ex);
    	}
    }
    //Omit the source code we don't care about
}

Also, if we set the scoped Bean's proxy mode to CGLIB, a scoped-proxy Bean definition is created from the original Bean definition when the Bean definitions are processed. The corresponding source code:

ScopedProxyUtils

public static BeanDefinitionHolder createScopedProxy(BeanDefinitionHolder definition,
			BeanDefinitionRegistry registry, boolean proxyTargetClass) {

    //Original target Bean name
	String originalBeanName = definition.getBeanName();
	//Get the original target Bean definition
	BeanDefinition targetDefinition = definition.getBeanDefinition();
	//Compute the new name under which the original target Bean will be registered (the "scopedTarget." prefix)
	String targetBeanName = getTargetBeanName(originalBeanName);

    //Create a Bean of type ScopedProxyFactoryBean
	RootBeanDefinition proxyDefinition = new RootBeanDefinition(ScopedProxyFactoryBean.class);
    //Configure the proxy Bean definition's attributes from the original target Bean definition (that part of the source is omitted here)

    //Copy the autowire-candidate and primary attributes of the original target Bean definition to the proxy Bean definition
	proxyDefinition.setAutowireCandidate(targetDefinition.isAutowireCandidate());
	proxyDefinition.setPrimary(targetDefinition.isPrimary());
	if (targetDefinition instanceof AbstractBeanDefinition) {
		proxyDefinition.copyQualifiersFrom((AbstractBeanDefinition) targetDefinition);
	}

	//Set the original Bean definition not to be automatically loaded and not to be Primary
	//In this way, the proxy Bean rather than the original target Bean is obtained through the BeanFactory and automatically loaded
	targetDefinition.setAutowireCandidate(false);
	targetDefinition.setPrimary(false);

	//Register the original target Bean definition under its new name
	registry.registerBeanDefinition(targetBeanName, targetDefinition);

	return new BeanDefinitionHolder(proxyDefinition, originalBeanName, definition.getAliases());
}

private static final String TARGET_NAME_PREFIX = "scopedTarget.";
//Utility method that produces the renamed target Bean's name; we also used it above when destroying the scoped Bean
public static String getTargetBeanName(String originalBeanName) {
	return TARGET_NAME_PREFIX + originalBeanName;
}

What is the role of this proxy Bean? The key point is that every time any method is called on the proxy, it obtains the actual Bean through the BeanFactory and delegates the call to it. Reference source code:

Proxy class ScopedProxyFactoryBean

public class ScopedProxyFactoryBean extends ProxyConfig
		implements FactoryBean<Object>, BeanFactoryAware, AopInfrastructureBean {
    private final SimpleBeanTargetSource scopedTargetSource = new SimpleBeanTargetSource();
    //This is the actual proxy generated through SimpleBeanTargetSource. All Bean method calls will be called through this proxy
    private Object proxy;
}

SimpleBeanTargetSource is the actual target source for the proxy. Its implementation is very simple; the core method obtains the Bean from the BeanFactory by the target Bean name:

public class SimpleBeanTargetSource extends AbstractBeanFactoryBasedTargetSource {
	@Override
	public Object getTarget() throws Exception {
		return getBeanFactory().getBean(getTargetBeanName());
	}
}

Obtaining the Bean through the BeanFactory means, as the source analysis above showed, that for a Bean with a custom Scope the get method of that Scope is called on every access.
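To make the delegation concrete, here is a minimal sketch of the same idea (my own illustration, not Spring's actual ScopedProxyFactoryBean; all names except the Spring APIs are made up): a proxy backed by a TargetSource that looks the real Bean up in the BeanFactory on every invocation, built with Spring's ProxyFactory.

import org.springframework.aop.TargetSource;
import org.springframework.aop.framework.ProxyFactory;
import org.springframework.beans.factory.BeanFactory;

public class ScopedProxySketch {

    //A TargetSource that resolves the target Bean from the BeanFactory on every call
    static class BeanFactoryLookupTargetSource implements TargetSource {
        private final BeanFactory beanFactory;
        private final String targetBeanName;
        private final Class<?> targetClass;

        BeanFactoryLookupTargetSource(BeanFactory beanFactory, String targetBeanName, Class<?> targetClass) {
            this.beanFactory = beanFactory;
            this.targetBeanName = targetBeanName;
            this.targetClass = targetClass;
        }

        @Override
        public Class<?> getTargetClass() {
            return targetClass;
        }

        @Override
        public boolean isStatic() {
            //false: the target may change between invocations, so it is resolved for every call
            return false;
        }

        @Override
        public Object getTarget() {
            //Invoked for every method call on the proxy; for a custom-scoped Bean this goes through the Scope's get method
            return beanFactory.getBean(targetBeanName);
        }

        @Override
        public void releaseTarget(Object target) {
            //Nothing to release in this sketch
        }
    }

    //Build a CGLIB proxy whose method calls are routed through the TargetSource above
    public static Object createProxy(BeanFactory beanFactory, String targetBeanName, Class<?> targetClass) {
        ProxyFactory proxyFactory = new ProxyFactory();
        proxyFactory.setTargetSource(new BeanFactoryLookupTargetSource(beanFactory, targetBeanName, targetClass));
        proxyFactory.setTargetClass(targetClass);
        proxyFactory.setProxyTargetClass(true); //CGLIB subclass proxy, like ScopedProxyMode.TARGET_CLASS
        return proxyFactory.getProxy();
    }
}

Spring's real scoped proxy achieves the same effect through SimpleBeanTargetSource, as shown above.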

Now for destruction. When the BeanFactory creates the Bean object, it calls registerDestructionCallback on the custom Scope, passing in the Bean's destruction callback:

AbstractBeanFactory

protected void registerDisposableBeanIfNecessary(String beanName, Object bean, RootBeanDefinition mbd) {
	AccessControlContext acc = (System.getSecurityManager() != null ? getAccessControlContext() : null);
	if (!mbd.isPrototype() && requiresDestruction(bean, mbd)) {
		if (mbd.isSingleton()) {
			//For singleton
			registerDisposableBean(beanName, new DisposableBeanAdapter(
					bean, beanName, mbd, getBeanPostProcessorCache().destructionAware, acc));
		}
		else {
			//For custom Scope
			Scope scope = this.scopes.get(mbd.getScope());
			if (scope == null) {
				throw new IllegalStateException("No Scope registered for scope name '" + mbd.getScope() + "'");
			}
			//Call registerDestructionCallback
			scope.registerDestructionCallback(beanName, new DisposableBeanAdapter(
					bean, beanName, mbd, getBeanPostProcessorCache().destructionAware, acc));
		}
	}
}

When we want to destroy a scoped Bean, we call the BeanFactory's destroyScopedBean method, which calls the custom Scope's remove method:

AbstractBeanFactory

@Override
public void destroyScopedBean(String beanName) {
	RootBeanDefinition mbd = getMergedLocalBeanDefinition(beanName);
	//Used only for custom scope beans
	if (mbd.isSingleton() || mbd.isPrototype()) {
		throw new IllegalArgumentException(
				"Bean name '" + beanName + "' does not correspond to an object in a mutable scope");
	}
	String scopeName = mbd.getScope();
	Scope scope = this.scopes.get(scopeName);
	if (scope == null) {
		throw new IllegalStateException("No Scope SPI registered for scope name '" + scopeName + "'");
	}
	//Call the remove method of the custom Scope
	Object bean = scope.remove(beanName);
	if (bean != null) {
		destroyBean(beanName, bean, mbd);
	}
}
