Curious here... What is actually happening when these often take 6-8 hours? Where I work, we can deploy in a few minutes and have the production environment up and running, albeit not in the gaming industry — probably a totally different deployment setup, and I'm sure a lot less code. Would be great if a developer could chime in here.
You're patching the code so it's a quick drag and drop. EQ is patching the servers. They have to wait for the glue to dry.
Well you see, time travel is not an exact science, so sometimes when they come back with the coding notes from the early 2000's, the machine places them ahead more than their estimated time allotment back into the present timeline. Then of course they have to take a couple hours to have a coffee and donut breaks, mental stability checks from time travel, convert everything to run on present day scripts, and yes it only takes a few minutes to deploy the code as per normal.
I agree; would be fun/educational to hear more about this process. One thing I learned a few years ago: the EQ servers sometimes share the same database. I'd always assumed that each EQ server had its own, distinct database. One database for Bertox, one database for Mischief, another database for Aradune, another for Vaniki, etc. Perhaps hosted on the same server/VM. But separate databases. Turns out, not always true! Multiple EQ servers can share a database together. I was rather shocked to hear this. This would be 1 reason why transferring/merging characters and accounts can be challenging. Not saying this month's maintenance actually involves that; just thought I'd share that info.
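One way to picture why a shared database complicates character transfers: if rows from two servers live in the same tables and use overlapping ID ranges, a merge has to detect collisions and remap them. A toy sketch of that problem (the schema, IDs, and names here are purely illustrative — EQ's actual database layout is unknown):

```python
# Toy illustration of ID collisions when merging character tables from two
# servers that shared an ID space. Everything here is invented for the example.

def merge_characters(server_a: dict, server_b: dict) -> dict:
    """Merge two {char_id: name} tables, remapping colliding IDs from B."""
    merged = dict(server_a)
    next_id = max(merged, default=0) + 1
    for char_id, name in server_b.items():
        if char_id in merged:           # collision: same ID used on both servers
            merged[next_id] = name      # give the incoming character a fresh ID
            next_id += 1
        else:
            merged[char_id] = name
    return merged

a = {1: "Firiona", 2: "Lanys"}
b = {1: "Mayong", 7: "Zebuxoruk"}
print(merge_characters(a, b))  # {1: 'Firiona', 2: 'Lanys', 3: 'Mayong', 7: 'Zebuxoruk'}
```

And remapping IDs is only the start — every foreign key pointing at the old ID (inventory, guild membership, friends lists) would need the same treatment, which is presumably part of why these operations are hard.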
Literally thousands of interconnected virtual zone servers. If they all boot up and hit the central database at once, it would be overwhelmed and return timeout errors, causing boot failures, so they have to slightly stagger the booting process; even a one-second delay between each start adds up to 3-4 hours.
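The arithmetic behind that claim is easy to check. A minimal sketch, assuming a fixed per-server delay and an invented zone count (the names and numbers here are illustrative, not EQ's actual tooling):

```python
# Back-of-envelope check on staggered startup time.
# ZONE_COUNT is an assumed figure chosen to match "thousands of zone servers".

ZONE_COUNT = 12_000      # hypothetical number of zone servers
STAGGER_SECONDS = 1.0    # delay between consecutive boots

def total_stagger_time(zone_count: int, delay: float) -> float:
    """Wall-clock time spent purely waiting between boots (seconds)."""
    return (zone_count - 1) * delay

hours = total_stagger_time(ZONE_COUNT, STAGGER_SECONDS) / 3600
print(f"{hours:.1f} hours of pure stagger delay")  # ~3.3 hours
```

In practice a real orchestrator would more likely boot in batches with randomized jitter rather than strictly one at a time, but the point stands: tiny per-server delays multiply fast at this scale.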
This isn't the monthly (or a) patch. Try checking these for useful information: https://forums.daybreakgames.com/eq/index.php?recent-activity/ https://forums.daybreakgames.com/eq/index.php?forums/news-and-announcements.2/
They have to give the gerbils a rest and some outside time due to a PETA lawsuit. But seriously, while there are a lot of streamlined production deployment strategies that allow uninterrupted server-side and client-side updates, streamlining EQ's deployment process is pretty far down the list of priorities. Instead it probably involves taking servers down, doing "something" to different filesystems, running some smoke tests, turning everything back on, and then putting out a "Some players may be noticing an interruption in service" notice when stuff breaks.
EQ has a lot of processes and hosts involved (in the live environment). Sometimes, it does not like being shut down and restarted. Various parts of shutdown/startup can break and be difficult to start over, especially if things are in a bad state. It's definitely something that could be improved on. Being able to shut down and start the game quickly would be extremely beneficial to EQ (similarly, adding more / easier ways to hotfix things). This particular downtime was specifically for other dept maintenance that required downtime of systems that EQ relies on, meaning we also had to go down (as well as the other games). The reason the downtime was extended was due to issues with the build system itself.
Be awesome as well if they posted the patch notes.... I seem to recall in the early years the patch notes were posted right after the servers came down. Now you're lucky to have them posted after the servers are up (like today).