
MattScott

CEO
Everything posted by MattScott

  1. Hi all, We are working on the issue. Lixil has updated her original post with more information. We moved many servers last week. Unfortunately, one of them has some bad hardware and is malfunctioning. Logging in is hit or miss right now. We are hoping to fix the hardware tomorrow (5/6). Thanks, Matt
  2. Hi there, Lixil just updated her post in Social. Essentially, we have a bad piece of equipment in the new data center that is causing a problem. We have already secured a replacement part and we hope to schedule downtime tomorrow (5/6) to fix the issue. Sorry, Matt
  3. Hi Wastelanders, Servers are up. My full post about the incident is here: We will schedule Commander early this week so everyone gets notice and can enjoy it properly. Thanks, Matt
  4. Hi all, On Monday, April 29th, we scheduled downtime to physically move from one data center to a new one. This move was unavoidable: a very expensive legacy Reloaded contract had ended, and even though we asked for an extension, we were forced to move out before May 1st, 2019. As you know, Fallen Earth is an extremely old game. The servers run on an OS that is no longer available to download, it takes more than an hour just to reboot, and it requires many different databases that all have to be synced to operate correctly. We made backups and planned to move each system one by one into the new data center. However, despite many precautions, some data was lost. The engineers have spent their waking hours attempting to find the right mix of files to get everything restored properly. We did eventually get the system back online, but it appears roughly 2 weeks of progress was lost. Having exhausted all other options, we are going to put the servers back online and move forward. In the meantime, we'll be doing the following to help players recover:
     - We'll be giving out Commander to all players for 2 weeks to help them get caught back up
     - Anyone who lost purchases due to the rollback can open a trouble ticket at http://support.gamersfirst.com, and we'll escalate getting those taken care of as quickly as we can
     We're not going to start the Commander for another couple of days, so that all the players can get up to speed on what has happened. I want everyone to be able to take full advantage of the boost over the coming weeks. Please know that the team worked very hard to get us to this point, and we are committed to getting the back end rewritten so it can be properly supported in the future.
     EDIT: We will be turning on 2 weeks of Commander for the Fallen Earth players.
     EDIT: We waited an extra couple of days to try and make sure server performance was better. Effective 5/16, we have activated a 4 week Commander code as compensation to the players in the hopes of helping them catch up on lost progress. The code is: FallenNot4gotten
     Apologies, Matt
  5. Hi all, I think we've reached the limits of what we can do, and the downtime has already been excessive. We did some tests, and the newer data is too incomplete to work. With that in mind, the servers are going to be put back online shortly, and I'll make a public post about the missing data for the rest of the players. Moving forward:
     - We'll be giving out Commander to all players for 2 weeks to help them get caught back up
     - Anyone who lost purchases due to the rollback can open a trouble ticket at http://support.gamersfirst.com, and we'll escalate getting those taken care of as quickly as we can
     The team is committed to getting the back end rewritten so it can be properly supported. Apologies, Matt
  6. Hi all, We are going to temporarily take down FE so that players don't keep playing. The hope is to quickly test a more up-to-date database and then put the game back online. Thanks, Matt
  7. Hi all, We are going to try getting our secondary environment up and running. Then we can test a new set of data. This won't be a super fast process, and many of my team are recuperating this weekend. Bear with us. My advice for current players would be to not go crazy. If we can confirm a solid way to restore the 2 weeks of missing progress, we will. But that will mean losing anything since the servers came back online. Thanks, Matt
  8. Hi all, I am looking at the data issue with my team, and there is no easy answer. Right now we have two choices.
     1) Leave it the way it is and work on fixing the accounts that lost paid items. We have records on the payment side, so it should be easy to restore those. The team is exhausted, but over time we can also try restoring some of the new data in a separate area, which would allow us to verify some of the bigger lost progression items / in-game items in order to grant those back to players.
     2) Take the servers down and try another round of restoring data from a different set of possibly newer backups. We already know one of the databases from this newer set of backups is significantly out of sync and older. So there is risk that we will introduce a whole bunch of problems with incompatible data.
     Personally, as painful as it is, I'm going to recommend that we plow forward and escalate the support tickets for the players who lost real money purchases. Apologies for the issues. Sorry, Matt
  9. Hi all, We have the servers working now. We need to keep them locked for some QA to make sure everything in-game looks okay. We appreciate your patience. Thanks, Matt
  10. We can't upgrade the hardware. And the code for the servers runs on an OS that isn't available any more. For the move, we made images of everything. Backed up everything. And then moved the hardware 1 to 1, so as not to disrupt anything. Even with all of our precautions, we still have problems. My approach is to upgrade the code first and then we can update the hardware properly. That effort has been underway for quite a while now.
  11. Hi all, I can only imagine how frustrating it is for everyone right now. I wish I had more details to give out. Fallen Earth is a collection of many databases and many different servers that all have to mesh together properly for us to unlock the doors and let players in. There were problems in the move and in the restore process that we are still working out. To give you an idea of how old and large the game is, it takes over an hour to fully reboot everything and come back online. That means we make some changes, but then wait significant amounts of time just to see the results. Then we make more changes and wait some more. It's been extremely frustrating. The team is going to continue working on the servers this weekend until they are online properly. Stay tuned. Thanks, Matt
  12. Hi everyone, The team is still working through issues, but progress is being made. Thanks, Matt
  13. Hi all, We're in a weird position at this point. Technically, all the servers are working properly. All the databases are restored and up. However, the system overall is unbearably slow. It takes hours to boot and load all the data. Once it is up, the timeouts between servers make the system unstable and unplayable. We took everything down and restarted it all from scratch about an hour ago, and not everything is fully booted. I'll keep you posted. Thanks, Matt
  14. Hi all, Just posting another status. We are currently working through a database speed issue and an issue with the login server. The engineers on my end are troubleshooting. Thanks, Matt
  15. Hi all, One more brief update. We're waiting on a trouble ticket to get fulfilled with our new datacenter to fix a final hardware issue. Then we have two tasks that can be completed, and the servers can come back online. I can't predict how long the trouble ticket will take to get addressed. But I'll let everyone know once we're back online. Thanks, Matt
  16. I recommend this site: http://www.azurespeed.com/ You can see what your latency is to various parts of the country without going through our network. We did not move any of the district servers (Jericho, Citadel, or Nekrova). So your latency shouldn't be affected. However, if you find that the speed site reports no latency issues, then you can get on Jericho and run /latencytest and send it to me in a PM on the forums.
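     If you'd rather measure from a script than from a website, a rough TCP connect test like the minimal Python sketch below gives a comparable baseline. The host and port here are placeholders, not official GamersFirst endpoints; substitute whichever server you actually want to test against.

     import socket
     import statistics
     import time

     # Placeholder endpoint; swap in the host/port you actually want to measure.
     HOST = "example.com"
     PORT = 443
     SAMPLES = 10

     def connect_latency_ms(host, port, timeout=2.0):
         """Time a single TCP connection setup, in milliseconds."""
         start = time.perf_counter()
         with socket.create_connection((host, port), timeout=timeout):
             pass
         return (time.perf_counter() - start) * 1000.0

     samples = [connect_latency_ms(HOST, PORT) for _ in range(SAMPLES)]
     print("min %.1f ms / avg %.1f ms / max %.1f ms over %d connects"
           % (min(samples), statistics.mean(samples), max(samples), SAMPLES))

     Each connect includes the full TCP handshake, so the numbers will run a little higher than a raw ping, but the relative differences between locations are still meaningful.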
  17. Hi there, This is just temporary. We had an issue with this payment type during the move. We have reached out to them to get it fixed, and we will turn it back on as soon as possible. Thanks, Matt
  18. Hi all, Just a quick update. The team has finally worked out most of the issues (I hope). I did see the first couple of servers come online internally. Don't try to log in yet; it won't work. I'm estimating about 1 more hour. Thanks, Matt
  19. Hi all, With all the craziness this week and the server outages, the development team hasn't made enough progress on the changes to RIOT for a play test this week. We're going to aim for May 10th instead. I have updated the original post to reflect the new date: Thanks, Matt
  20. Hi all, I have posted an update here. Thanks, Matt
  21. Hi everyone, All servers are back online for PC, XB1, and PS4 across Jericho, Citadel, and Nekrova. I sincerely apologize to all the players for this outage. I have no interest in sugarcoating this or dodging blame. I own this failure. What appeared relatively straightforward ended up being riddled with landmines. This network move took much too long, and our inability to properly communicate when the game would come back online was unacceptable. Needless to say, I am unhappy with how this all went, and I am sorry for our performance. I am happy to do a future post to walk through why we had to do this and what happened that led to the various issues. But I will do my best to make sure this doesn't happen ever again, and to make sure we properly compensate the players for the downtime. Sorry, Matt
  22. Hi all, Sorry for the lack of updates. The team is working as fast as they can. We had to send one engineer to bed after nearly 3 days of mostly being awake. At this point it's a waiting game. We feel we have the problem identified, which is a routing issue inside the new data center that affects all of the servers. We're currently going box to box to fix the issue, and then we'll spin up the game. Apologies, Matt
  23. Hi all, I am sorry for the unexpected outage. We moved some critical services from one physical location to another. Everything appeared fine yesterday. However, after disconnecting the old location and turning everything off, we found that several services hadn't been transitioned properly, causing the new servers to stop functioning correctly. Unfortunately, several of the engineers had already been up for 24 hours and were asleep when we noticed the issue. Everyone is back up and looking at it now. I'll have a better update soon. Apologies, Matt