k8s StatefulSet: config-server deployed with 3 replicas, eureka cluster not working properly #2488

Closed
yan-Ops opened this issue Jul 31, 2019 · 4 comments
Labels
area/configservice (apollo-configservice), area/kubernetes, kind/report-problem (categorizes issues where someone reports a problem they met), stale

Comments

@yan-Ops

yan-Ops commented Jul 31, 2019

Deployed three config-server pods; only two of them are healthy, and the third reports a port conflict. Kubernetes should not have port conflicts between pods. Full log:

2019-07-31 18:04:00.163 WARN 47 --- [nfoReplicator-0] c.n.d.s.t.d.RetryableEurekaHttpClient : Request execution failed with message: java.net.UnknownHostException: apollo-config-server-1.apollo-meta-server
2019-07-31 18:04:00.187 INFO 47 --- [ Thread-17] o.s.c.n.e.server.EurekaServerBootstrap : isAws returned false
2019-07-31 18:04:00.188 INFO 47 --- [ Thread-17] o.s.c.n.e.server.EurekaServerBootstrap : Initialized server context
2019-07-31 18:04:00.193 ERROR 47 --- [ main] o.apache.catalina.core.StandardService : Failed to start connector [Connector[HTTP/1.1-8080]]

org.apache.catalina.LifecycleException: Failed to start component [Connector[HTTP/1.1-8080]]
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:167)
at org.apache.catalina.core.StandardService.addConnector(StandardService.java:225)
at org.springframework.boot.web.embedded.tomcat.TomcatWebServer.addPreviouslyRemovedConnectors(TomcatWebServer.java:256)
at org.springframework.boot.web.embedded.tomcat.TomcatWebServer.start(TomcatWebServer.java:198)
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.startWebServer(ServletWebServerApplicationContext.java:300)
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.finishRefresh(ServletWebServerApplicationContext.java:162)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:553)
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:140)
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:780)
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:412)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:333)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1277)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1265)
at com.ctrip.framework.apollo.configservice.ConfigServiceApplication.main(ConfigServiceApplication.java:35)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:54)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.catalina.LifecycleException: Protocol handler start failed
at org.apache.catalina.connector.Connector.startInternal(Connector.java:1020)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
... 19 common frames omitted
Caused by: java.net.BindException: Address in use
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.apache.tomcat.util.net.NioEndpoint.bind(NioEndpoint.java:219)
at org.apache.tomcat.util.net.AbstractEndpoint.start(AbstractEndpoint.java:1151)
at org.apache.coyote.AbstractProtocol.start(AbstractProtocol.java:591)
at org.apache.catalina.connector.Connector.startInternal(Connector.java:1018)
... 20 common frames omitted

2019-07-31 18:04:00.199 INFO 47 --- [ Thread-17] c.n.e.registry.AbstractInstanceRegistry : Registered instance APOLLO-CONFIGSERVICE/apollo-config-server-2.apollo-meta-server.config.svc.cluster.local:apollo-configservice:8080 with status UP (replication=true)
2019-07-31 18:04:00.200 INFO 47 --- [ Thread-17] c.n.e.registry.AbstractInstanceRegistry : Registered instance APOLLO-CONFIGSERVICE/apollo-config-server-0.apollo-meta-server.config.svc.cluster.local:apollo-configservice:8080 with status UP (replication=true)
2019-07-31 18:04:00.200 INFO 47 --- [ Thread-17] c.n.e.registry.AbstractInstanceRegistry : Registered instance APOLLO-ADMINSERVICE/apollo-admin-server-f455f879f-667dl:apollo-adminservice:8090 with status UP (replication=true)
2019-07-31 18:04:00.201 INFO 47 --- [ Thread-17] c.n.e.registry.AbstractInstanceRegistry : Registered instance APOLLO-ADMINSERVICE/apollo-admin-server-f455f879f-2r8s6:apollo-adminservice:8090 with status UP (replication=true)
2019-07-31 18:04:00.201 INFO 47 --- [ Thread-17] c.n.e.registry.AbstractInstanceRegistry : Registered instance APOLLO-ADMINSERVICE/apollo-admin-server-f455f879f-t956x:apollo-adminservice:8090 with status UP (replication=true)
2019-07-31 18:04:00.201 INFO 47 --- [ Thread-17] c.n.e.r.PeerAwareInstanceRegistryImpl : Got 5 instances from neighboring DS node
2019-07-31 18:04:00.201 INFO 47 --- [ Thread-17] c.n.e.r.PeerAwareInstanceRegistryImpl : Renew threshold is: 8
2019-07-31 18:04:00.203 INFO 47 --- [ Thread-17] c.n.e.r.PeerAwareInstanceRegistryImpl : Changing status to UP
2019-07-31 18:04:00.204 INFO 47 --- [ main] o.apache.catalina.core.StandardService : Stopping service [Tomcat]
2019-07-31 18:04:00.215 WARN 47 --- [ost-startStop-1] o.a.c.loader.WebappClassLoaderBase : The web application [ROOT] appears to have started a thread named [HikariPool-1 housekeeper] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread:
sun.misc.Unsafe.park(Native Method)
java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078)
java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093)
java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809)
java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1074)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
2019-07-31 18:04:00.216 WARN 47 --- [ost-startStop-1] o.a.c.loader.WebappClassLoaderBase : The web application [ROOT] appears to have started a thread named [spring.cloud.inetutils] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread:
sun.misc.Unsafe.park(Native Method)
java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2039)
java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442)
java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1074)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
2019-07-31 18:04:00.217 WARN 47 --- [ost-startStop-1] o.a.c.loader.WebappClassLoaderBase : The web application [ROOT] appears to have started a thread named [Apollo-ConfigRefresher-1] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread:
sun.misc.Unsafe.park(Native Method)
java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078)
java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1093)
java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:809)
java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1074)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1134)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
2019-07-31 18:04:00.220 INFO 47 --- [ Thread-17] e.s.EurekaServerInitializerConfiguration : Started Eureka Server
2019-07-31 18:04:00.223 INFO 47 --- [nfoReplicator-0] c.n.d.s.t.d.RetryableEurekaHttpClient : Request execution succeeded on retry #1
2019-07-31 18:04:00.223 INFO 47 --- [nfoReplicator-0] com.netflix.discovery.DiscoveryClient : DiscoveryClient_APOLLO-CONFIGSERVICE/apollo-config-server-1.apollo-meta-server.config.svc.cluster.local:apollo-configservice:8080 - registration status: 204
2019-07-31 18:04:00.237 INFO 47 --- [ main] ConditionEvaluationReportLoggingListener :

Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.
2019-07-31 18:04:00.247 ERROR 47 --- [ main] o.s.b.d.LoggingFailureAnalysisReporter :


APPLICATION FAILED TO START


Description:

The Tomcat connector configured to listen on port 8080 failed to start. The port may already be in use or the connector may be misconfigured.

Action:

Verify the connector's configuration, identify and stop any process that's listening on port 8080, or configure this application to listen on another port.

2019-07-31 18:04:00.250 INFO 47 --- [ main] ConfigServletWebServerApplicationContext : Closing org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@5f144de3: startup date [Wed Jul 31 18:03:42 CST 2019]; parent: org.springframework.context.annotation.AnnotationConfigApplicationContext@57d46aae
2019-07-31 18:04:00.251 INFO 47 --- [ main] o.s.c.n.e.s.EurekaServiceRegistry : Unregistering application apollo-configservice with eureka with status DOWN
2019-07-31 18:04:00.251 WARN 47 --- [ main] com.netflix.discovery.DiscoveryClient : Saw local status change event StatusChangeEvent [timestamp=1564567440251, current=DOWN, previous=UP]
2019-07-31 18:04:00.251 WARN 47 --- [ main] c.n.discovery.InstanceInfoReplicator : Ignoring onDemand update due to rate limiter
2019-07-31 18:04:00.254 INFO 47 --- [ main] o.s.c.support.DefaultLifecycleProcessor : Stopping beans in phase 0
2019-07-31 18:04:00.257 INFO 47 --- [ main] c.n.eureka.DefaultEurekaServerContext : Shutting down ...
2019-07-31 18:04:00.266 INFO 47 --- [ main] c.n.eureka.DefaultEurekaServerContext : Shut down
2019-07-31 18:04:00.272 INFO 47 --- [ main] o.s.j.e.a.AnnotationMBeanExporter : Unregistering JMX-exposed beans on shutdown
2019-07-31 18:04:00.272 INFO 47 --- [ main] o.s.j.e.a.AnnotationMBeanExporter : Unregistering JMX-exposed beans
2019-07-31 18:04:00.280 WARN 47 --- [nfoReplicator-0] com.netflix.discovery.DiscoveryClient : Saw local status change event StatusChangeEvent [timestamp=1564567440280, current=UP, previous=DOWN]
2019-07-31 18:04:00.281 WARN 47 --- [nfoReplicator-0] c.n.discovery.InstanceInfoReplicator : Ignoring onDemand update due to rate limiter
2019-07-31 18:04:00.281 INFO 47 --- [nfoReplicator-0] com.netflix.discovery.DiscoveryClient : DiscoveryClient_APOLLO-CONFIGSERVICE/apollo-config-server-1.apollo-meta-server.config.svc.cluster.local:apollo-configservice:8080: registering service...
2019-07-31 18:04:00.284 INFO 47 --- [ main] j.LocalContainerEntityManagerFactoryBean : Closing JPA EntityManagerFactory for persistence unit 'default'
2019-07-31 18:04:00.286 INFO 47 --- [nfoReplicator-0] com.netflix.discovery.DiscoveryClient : DiscoveryClient_APOLLO-CONFIGSERVICE/apollo-config-server-1.apollo-meta-server.config.svc.cluster.local:apollo-configservice:8080 - registration status: 204
2019-07-31 18:04:00.287 INFO 47 --- [ main] com.netflix.discovery.DiscoveryClient : Shutting down DiscoveryClient ...
2019-07-31 18:04:00.292 WARN 47 --- [ main] .s.c.a.CommonAnnotationBeanPostProcessor : Invocation of destroy method failed on bean with name 'scopedTarget.eurekaClient': org.springframework.beans.factory.BeanCreationNotAllowedException: Error creating bean with name 'eurekaInstanceConfigBean': Singleton bean creation not allowed while singletons of this factory are in destruction (Do not request a bean from a BeanFactory in a destroy method implementation!)
2019-07-31 18:04:00.292 INFO 47 --- [ main] com.zaxxer.hikari.HikariDataSource : HikariPool-1 - Shutdown initiated...
2019-07-31 18:04:00.303 INFO 47 --- [ main] com.zaxxer.hikari.HikariDataSource : HikariPool-1 - Shutdown completed.
Exception in thread "main" java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:62)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:54)
... 1 more
Caused by: org.springframework.boot.web.embedded.tomcat.ConnectorStartFailedException: Connector configured to listen on port 8080 failed to start
at org.springframework.boot.web.embedded.tomcat.TomcatWebServer.checkThatConnectorsHaveStarted(TomcatWebServer.java:228)
at org.springframework.boot.web.embedded.tomcat.TomcatWebServer.start(TomcatWebServer.java:203)
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.startWebServer(ServletWebServerApplicationContext.java:300)
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.finishRefresh(ServletWebServerApplicationContext.java:162)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:553)
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:140)
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:780)
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:412)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:333)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1277)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1265)
at com.ctrip.framework.apollo.configservice.ConfigServiceApplication.main(ConfigServiceApplication.java:35)
... 6 more
^C

@nobodyiam
Member

The error is indeed java.net.BindException: Address in use. You can log into the pod and check what is occupying the port.
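
A minimal sketch of that check with kubectl (pod and namespace names are guessed from the DNS names in the log above, and the tools available inside the container may differ from image to image):

# Pod/namespace names below come from the log (namespace "config",
# pod apollo-config-server-1); adjust them to your environment.
# ss/netstat may be missing from a slim image, so /proc/net/tcp is the
# fallback (1F90 is 8080 in hex).
kubectl -n config exec apollo-config-server-1 -- sh -c \
  'ss -tlnp 2>/dev/null | grep 8080 || grep -i :1F90 /proc/net/tcp'

# Also check whether more than one java process ended up inside the
# container, e.g. the entrypoint starting the service twice.
kubectl -n config exec apollo-config-server-1 -- ps aux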

@tony-liuliu

Post your StatefulSet so we can take a look. The containers are isolated, so how could the port already be in use?
My deployment runs fine.
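
For that comparison, the commands below would dump the parts of the reporter's manifest most relevant to a port clash. Resource names are only inferred from the DNS names in the log (StatefulSet apollo-config-server, headless Service apollo-meta-server, namespace config), so treat them as assumptions:

# Dump the StatefulSet and the headless Service behind
# apollo-config-server-N.apollo-meta-server.config.svc.cluster.local
# (names inferred from the log; adjust to your manifest).
kubectl -n config get statefulset apollo-config-server -o yaml
kubectl -n config get service apollo-meta-server -o yaml

# hostNetwork or hostPort in the pod spec is one way a containerized port
# can still collide with something on the node despite container isolation;
# this is only a guess, but it is quick to rule out.
kubectl -n config get statefulset apollo-config-server \
  -o jsonpath='{.spec.template.spec.hostNetwork}{"\n"}'
kubectl -n config get pod apollo-config-server-1 \
  -o jsonpath='{range .spec.containers[*].ports[*]}{.hostPort}{"\n"}{end}'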

@stale

stale bot commented Nov 29, 2019

This issue has been automatically marked as stale because it has not had activity in the last 90 days. It will be closed in 14 days unless it is tagged "help wanted" or other activity occurs. Thank you for your contributions.

@stale stale bot added the stale label Nov 29, 2019
@stale

stale bot commented Dec 20, 2019

This issue has been automatically closed because it has not had activity in the last 14 days. If this issue is still valid, please ping a maintainer and ask them to label it as "help wanted". Thank you for your contributions.

@stale stale bot closed this as completed Dec 20, 2019
@Anilople Anilople added the area/kubernetes, kind/report-problem, and area/configservice labels Dec 23, 2020