I am new to Kubernetes. I built three nodes, then realized I had messed up and deleted them. Now when I attempt to add new nodes, I can't, because the cluster keeps complaining about the nodes I have already deleted. When I attempt to delete them manually, I keep getting:
kubectl delete node ks01
Unable to connect to the server: dial tcp 10.21.30.165:6443: connect: network is unreachable
How can I forcefully delete the nodes, or clean up the cluster so I can start over?
CodePudding user response:
Try these steps
- First, mark the node as unschedulable to prevent new pods from being assigned to it, then drain it:
kubectl cordon <node_ID>
kubectl drain <node_ID>
- Then try to delete it:
kubectl delete node <node_ID>
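Putting those steps together, a minimal sketch, using the node name ks01 from the question as the placeholder and adding the drain flags that are commonly needed in practice (whether you need them depends on what is running on the node):

```shell
# Assumes kubectl is configured and the API server is reachable.
# Replace ks01 with the actual node name from `kubectl get nodes`.

# Stop new pods from being scheduled onto the node
kubectl cordon ks01

# Evict existing pods; these flags are often required in practice:
#   --ignore-daemonsets      skip DaemonSet-managed pods (they cannot be evicted)
#   --delete-emptydir-data   allow evicting pods that use emptyDir volumes
kubectl drain ks01 --ignore-daemonsets --delete-emptydir-data

# Remove the node object from the cluster
kubectl delete node ks01
```

Note that all three commands talk to the API server, so if the control plane at 10.21.30.165:6443 is genuinely unreachable (as the error in the question suggests), they will fail with the same "network is unreachable" message.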
CodePudding user response:
I went ahead and removed the ~/.kube/config file and then started over. This seemed to work for me.
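For a full restart, removing the kubeconfig alone only discards the stale client configuration. Assuming the cluster was set up with kubeadm (not stated in the question), a fuller teardown on each machine might look like this sketch:

```shell
# Assumes the cluster was created with kubeadm; run on each node.
# `kubeadm reset` reverts the changes made by `kubeadm init` / `kubeadm join`.
sudo kubeadm reset

# Remove the stale kubectl config that points at the old API server
rm -f ~/.kube/config

# Optionally clean up leftover CNI network configuration
sudo rm -rf /etc/cni/net.d
```

After this, the machines are clean and you can run `kubeadm init` (and `kubeadm join` on the workers) to build the cluster again from scratch.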