Serious question: how do you guys handle a core switch going down during a big update?

Honestly, I thought our backup configs were solid until last Thursday at 3 PM. We were pushing new firmware to our main Cisco 9300 stack and the whole thing just locked up. Took our entire floor offline for about 45 minutes. Ngl, we had to pull the power on the standby unit to force a reload from the boot flash. Has anyone else had a firmware update go that sideways on a supposedly stable platform? What's your rollback plan when the config replace command just hangs?
2 Comments
morgan.joseph
Oh man, that sounds like a special kind of nightmare. My plan for a meltdown like that is basically just to panic quietly and hope the old "turn it off and on again" magic works before anyone important notices. Staging a config on a flash drive is smart, but with my luck I'd trip over the console cable and unplug the whole rack.
8
chen.phoenix
You mentioned the config replace command hanging. That's the real killer. My rollback plan is to never trust a single command for that. I stage the old config on a spare flash drive as a text file and have a console cable ready to manually copy-paste line by line from a laptop if the automated process fails. It's slow, but it beats a bricked switch.
1
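
For anyone who wants a starting point for the staged-rollback idea above, here's a minimal sketch of the built-in config replace mechanism with a safety timer. This assumes IOS-XE on a Catalyst 9300; the filename is hypothetical, and you should test the flow in a lab before trusting it during a real maintenance window.

```
! Take a known-good snapshot before the upgrade (filename is hypothetical)
copy running-config flash:pre-upgrade.cfg

! Roll back with a safety timer: if the session dies mid-replace,
! the switch reverts to the running config on its own after 120 seconds
! instead of sitting half-configured
configure replace flash:pre-upgrade.cfg time 120

! After verifying the rollback actually took, cancel the pending revert
configure confirm
```

The timer is the important part: if the replace hangs or your session drops, you don't end up locked out waiting for someone to walk to the rack.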