Stale pizza, backup BlackBerrys, payroll panic: Sony Pictures mega-hack

Sony Pictures has given select media outlets a behind-the-scenes look at how it handled its recent megabreach. Extensive accounts of the unfolding disaster in the New York Times, the Wall Street Journal and elsewhere reveal that Michael Lynton, the studio's chief executive, communicated with other senior execs using …

  1. yoganmahew

    Integration or parallelism?

    Everything these days is integrated, from instant messengers to calendars to email to intranets. While functional integration (upload/download) continues to look like a good idea, physical/logical integration (same DBs, servers, backup/replication mechanisms) is beginning to look like a risk?

    1. Anonymous Coward
      Anonymous Coward

      Re: Integration or parallelism?

      With pricing of MS SQL databases still in the range indicating that marketing assumes there is a single mainframe in the cellar holding all the data of the corporation, what do you expect?

      I expect PostgreSQL, but unfortunately that choice is not always palatable to higher-ups.

  2. thomas k.

    absence of disaster recovery

    Good thing the hackers posted everything online, then. Sony can just download it all from the TOR sites. Might take a while to get it all sorted and re-structured, though.

    1. Chris Miller

      Re: absence of disaster recovery

      What I don't understand is not the absence of DR - most organisations (though Sony has been around a long time) start up, grow and fold without ever experiencing a real 'disaster', and can therefore get away without a 'plan'. But in my experience, a company as large and high-profile as Sony will experience cyber attacks (not necessarily APTs - just DDoS, boring old viruses, etc., which still have the potential for serious disruption) several times per year (if not per month). The very first time it happens, you may not have put together an incident response plan, but surely after the 20th time even the most clueless operation will begin to think that something better than headless-chicken syndrome might be a good idea? Obviously not.

  3. Yet Another Anonymous coward Silver badge

    75% of servers were destroyed

    That's because they used those special movie servers where the hacker can log in and cause them to start belching clouds of dry ice and make alarms go off before they finally explode into a shower of sparks.

    Presumably Sony's email system also had about 12 characters per screen and made teletype noises as it printed the characters one at a time.

    1. frank ly

      Re: 75% of servers were destroyed

      We'll see the dramatic reconstruction when Sony get around to making a movie about it.

      "Sony Pictures was starring in its own disaster movie ..."

      I wonder how many of the original company executives will still be around when it's released.

      1. Anonymous Coward
        Anonymous Coward

        Re: 75% of servers were destroyed

        "I wonder how many of the original company executives will still be around when it's released."

        All of them. That's part of the problem.

    2. Obitim
      Terminator

      Re: 75% of servers were destroyed

      I can't allow you to do that, Dave

  4. Anonymous Coward
    Facepalm

    So...

    ...whoever it is who is currently their head of IT will soon be looking for a job, I reckon.

    NO Disaster recovery plan

    NO server redundancy

    NO backups worth a toss

    and, just to add the icing to the cake

    NO idea how it all happened? That means no security worth a toss either.

    Ye gods.

    1. Obitim

      Re: So...

      By the sounds of things, it's hubris on a massive scale; hopefully the next incumbent will have an ounce of sense about them, to at least audit what went on and mitigate it

    2. John Miles

      Re: whoever it is who is currently their head of IT

      I bet it won't be at board or CIO level, where the blame really lies

    3. Christoph

      Re: So...

      It might be that the head of IT was not allowed to implement proper security, by bosses who rubbished the idea of a company as big as Sony falling to a kid hacker.

      That won't stop the IT people taking the blame of course.

      1. Anonymous Coward
        Anonymous Coward

        Re: So...

        But at least they got pizza.

        In my former workplace, it was "Buy your pizza yourself; call me in the morning when the server has been cleaned up, I have an important presentation at 09:00. Too hard? You've only got yourself to blame. Btw, why hasn't the problem with the occasional spam in my mailbox been fixed yet?"

        1. Trevor_Pott Gold badge

          Re: So...

          "In my former workplace, it was "Buy your pizza yourself; call me in the morning when the server has been cleaned up, I have an important presentation at 09:00. Too hard? You have only got yourself to blame. Btw. why hasn't the problem with the occasional spam in my mailbox not been fixed yet?""

          Get out of my head, Charles!

    4. Anonymous Coward
      Anonymous Coward

      Re: So...

      "...whoever it is who is currently their head of IT will soon be looking for a job, I reckon.

      NO Disaster recovery plan

      NO server redundancy

      NO backups worth a toss

      and, just to add the icing to the cake

      NO idea how it all happened? That means no security worth a toss either."

      If the theory is that someone with 'root' or other demigod access leaked login details, and had knowledge of the DR plans, it's not impossible that the DR sites were deliberately FUBARd before the last wipe, too.

      Puts a new light on your DR plans though - do you have separate, secure logins for the DR ops that your main senior sysadmin team don't get access to? Who checks it's working? Are the login credentials changed each time? Etc.

      Irrelevant to me on a day-to-day basis, but definitely things I'll be looking at with my employers to see how we could survive a god-level privilege leak.

      1. yoganmahew

        Re: So...

        "Puts a new light on your DR plans though - do you have seperate, secure logins for the DR ops that your main senior sysadmin team don't get access to? Who checks it's working? Are the logion credentials changed each time? Etc."

        Physical access is yer only man. There should be certain things that can only be done on site. Most companies won't like this if they are a 24 hour operation, as it at least implies a monkey available to type (and retype) commands fed to them over the phone.

        This dinosaur never ceases to be amazed at the damage companies allow to be inflicted on them remotely.

      2. Anonymous Coward
        IT Angle

        Re: So...

        @AC

        Maybe Sony did get screwed by someone with root access (though I heard something about unencrypted spreadsheets of passwords that were in easily breached locations), but that still makes Sony a poster child for more robust DR.

      3. Anonymous Coward
        FAIL

        Re: So...

        Not so much hubris - that requires at least a modicum of competence, I would have thought. In this case, someone screwed the pooch so massively, it's amazing the SPCA weren't alerted.

        At the very LEAST, for their DR, they should have had a complete OFFLINE server backup, with the OS loaded and configured and all software installed and ready to go (minus the live data), ready to reload onto all the TARFUd servers; this should be their first-level backup, regularly refreshed with all updates, patches and reconfigurations. Call that the weekly configuration backup. You do not, EVER, hold that online: it's kept unplugged in a fireproof safe, and a VERY limited number of second or even third-line managers have access to those - or should, anyhow.

        Next up, each server has a daily offline backup performed, which is, once completed, again kept unplugged and offline, in a fireproof safe. Time consuming, yes, but if done after the non-IT workforce has bogged off home at the end of their 9-5 working day, doable. Again, access to the finished product should be limited, mostly to first or second-line managers, as it's the sort of thing that'll be needed should servers fall over for whatever reason.

        Those are the very MINIMUMS that any commercial entity should be looking at.
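
        As a rough sketch of that offline backup step - nothing Sony-specific, and the paths below are purely hypothetical - the archive-and-copy part of such a job might look like this in Python; unplugging the media and locking the safe remain stubbornly manual:

        ```python
        import datetime
        import shutil
        import tarfile
        from pathlib import Path

        # Hypothetical paths - substitute whatever actually needs preserving.
        CONFIG_DIRS = [Path("/etc"), Path("/opt/app/config")]
        SAFE_MOUNT = Path("/mnt/offline-safe")  # removable media, locked away after

        def offline_config_backup() -> Path:
            """Write a dated config archive to removable media, leaving no online copy."""
            stamp = datetime.date.today().isoformat()
            archive = Path(f"/var/backups/config-{stamp}.tar.gz")
            with tarfile.open(archive, "w:gz") as tar:
                for directory in CONFIG_DIRS:
                    tar.add(directory, arcname=directory.name)
            copied = Path(shutil.copy2(archive, SAFE_MOUNT / archive.name))
            archive.unlink()  # the only copy now lives on the offline media
            return copied

        if __name__ == "__main__":
            print(offline_config_backup())
        ```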

        Security constraints should have critical (e.g. root access) system UN/PW combos changing at least weekly (in a place I used to work, in a VERY paranoid environment, these were changed DAILY, and boy was that ever a pain), and they should be of HIGH strength in both composition and length. OK, UNs are guessable, but HSPWs should be impossible to guess, and at least a formidable roll versus sanity to remember - say 12 characters long, mixed upper and lower case plus numerals, and case-sensitive. Yes, it's a pain in the arse, but for a company like Sony, it should be de rigueur.
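
        For what it's worth, generating passwords to that spec is trivial to automate; a minimal sketch, assuming only the length and character classes suggested above:

        ```python
        import secrets
        import string

        # Throwaway generator for the policy sketched above: 12 characters,
        # mixed upper/lower case plus numerals. Purely illustrative.
        ALPHABET = string.ascii_letters + string.digits

        def strong_password(length: int = 12) -> str:
            """Draw until the result contains lower, upper and numeric characters."""
            while True:
                candidate = "".join(secrets.choice(ALPHABET) for _ in range(length))
                if (any(c.islower() for c in candidate)
                        and any(c.isupper() for c in candidate)
                        and any(c.isdigit() for c in candidate)):
                    return candidate

        if __name__ == "__main__":
            print(strong_password())
        ```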

        Frankly, this is so bloody basic a set of disaster and security requirements, which should be implemented as standard, that it beggars belief that they apparently failed to reach even this level of protection for themselves.

        As to budget cuts and refusals, yes, these happen, but not, I would have thought, to quite that degree.

        All things therefore being considered, I'd say this was an epic-level pooch screwing.

      4. Matt Bryant Silver badge
        Alert

        Re: AC Re: So...

        "....do you have seperate, secure logins for the DR ops that your main senior sysadmin team don't get access to?...." That concept is called "shadow teams" or an "exclusive rota". What you do is split your daily admin team into at least three subteams, each subteam gets superuser logins for a third of all the really important systems. Sharing logins with non-subteam members is prohibited (a fireable offence). Within the subteams you have a rota so that someone with the logins for all that subteam's systems is always working or on call. Only members of a particular subteam are allowed actual keyboard time on their systems, anyone else (even the CIO) can only ask or dictate typing tasks which have to be entered and actioned by the subteam member. That means, in the case of someone's login being stolen (or fished) then only a third of your critical systems are at risk and control of the remainder should give you the majority needed to rebuild or repurpose. The problem is shadow teams are expensive on the admin and require teams which follow the rules, which is why Sony probably cut a few corners.

    5. Captain DaFt

      Re: So...

      "...whoever it is who is currently their head of IT will soon be looking for a job, I reckon."

      Or very quietly about to get a big raise and a promotion, to keep him from going public with the emails from his own system:

      Application for Disaster recovery plan:

      DENIED! Too costly.

      Funds to ensure server redundancy:

      DENIED! Too costly.

      Funds to update backups and ensure they're viable:

      DENIED! Too costly.

      Funds and personnel to do security checks/hardening:

      DENIED! Too costly.

      1. Eddy Ito

        Re: So...

        @Captain DaFt,

        Sounds like a place I worked, and eventually it drove the head of IT to seek greener and saner pastures elsewhere. Before he left he changed all the admin/root passwords, hoping all the higher-ups would clearly understand why he was leaving, but most of them thought it was funny. The password was Dilbert.

    6. chivo243 Silver badge
      Pint

      Re: So...

      @RogerStenning

      Too many 'yes' answers to your seemingly Holmesian questions for a multinational company??? I think the ball went the other way....

      The only excuse is/will be - I was off that day! or it was my first day?!

      His next question will be:

      Can I repair your mobile phone?

      Live in the snake pit, die in the snake pit :-}

      Pint for the New Year!

    7. MAH

      Re: So...

      Let's be fair... in most disaster recovery environments, the critical servers are at least partially online (at OS level in many cases, waiting to mount the data, or in the Exchange and AD case, online all the time), and if it was an internal attack by someone with administrative credentials and a virus that spread using SMB sharing to the admin$ shares, then your DR servers are not going to help you, since they got hit as well. Obviously snapshots and backups are available to get the data, but you probably then need to rebuild the base OS of the Exchange servers to get them back online, since all nodes would be down... not hard, but not fun either.

      Guessing the surviving 25% were Linux systems, since they wouldn't have been impacted by this particular virus.

      Virtualization snapshots would probably allow them to bring a lot of other systems online, but I assume they needed to be sure the virus was dead before they could bring them back, or they'd be playing whack-a-mole as the systems went offline again.

    8. LinuxGuy

      Re: So...

      Sony, like other major media companies I work with, might have some security flaws, but it is difficult to activate a disaster recovery plan when both your production and DR sites (which Sony has in different states) have 75% of their systems wiped clean and the other 25% questionable for use, since they may have been injected with some sort of malicious code - including the tape libraries.

    9. skeptical i

      Re: So...

      Hi, Roger: Yes to the first three without question, but reserve the possibility on the fourth that Sony do know wha' happen but are waiting for enough time to pass before releasing their "final analysis" which will have only passing resemblance to the truth. Meantime, hush money and quiet resignations will have changed hands and the tell-all book is still a dozen years out.

      1. Anonymous Coward
        Trollface

        Re: So...

        Hiya, Skeptical i :-)

        heh. Makes you wonder if the resulting report'll be the basis for a remake of "Hackers" ;-)

    10. Amorous Cowherder
      Facepalm

      Sounds like a bad case of S.E.P. at work!

      (S)omebody (E)lses (P)roblem...

      Four people named Everybody, Somebody, Anybody and Nobody.

      There was an important job to be done and Everybody was sure that Somebody would do it. Anybody could have done it, but Nobody did it. Somebody got angry about that, because it was Everybody's job. Everybody thought Anybody could do it, but Nobody realized that Everybody wouldn't do it.

      It ended up that Everybody blamed Somebody when Nobody did what Anybody could have done.

    11. Colm28

      Re: So...

      This one looks good... http://scajobs.sony.com/careers/job_detail.asp?JobID=4956568

  5. Christoph

    Cheques

    "old machines that allowed them to issue physical payroll cheques"

    Aren't they lucky that the banks weren't allowed to close down the cheque system?

    However flashy and clever an electronic system is, you still need a physical fallback when it goes toes up.

    1. Yet Another Anonymous coward Silver badge

      Re: Cheques

      There is always cash - I believe the US govt has been printing quite a lot of that recently

      1. Destroy All Monsters Silver badge
        Big Brother

        Re: Cheques

        ...but if you carry it around in physical form, you are a suspect, and police can actually seize it with no particular justification, to finance their year-end party gear.

        This is not a joke.

  6. Destroy All Monsters Silver badge
    Devil

    ...including "IT MUST BE THE NORKS" FBI investigators...

    The timeless movie quote

    "You asked for a miracle. I give you the EFF BEE EYE"

    comes to mind.

  7. Paul Crawford Silver badge

    Example to us all

    Sony, up there with Gerald Ratner in the annals of business acumen!

    Can't say I feel sorry for the board/corporate ethos at all, but it is pretty shitty for all of the ordinary folk who work/worked there.

  8. Anonymous Coward
    Anonymous Coward

    "Despite the extreme disruption ... was viewed as nothing more severe than a colossal annoyance”

    Exactly. Until it was made public - only then did it become a "real disaster". It's still incredible how many executives believe they can amass a huge pile of dust (and something more brownish...) under the carpet unnoticed, as long as it doesn't become a "PR disaster"; only then do they acknowledge "we have a problem". They are so obsessed with "PR" that they can't really assess risks properly - the only risk is something becoming public and putting their comfy chair at risk. As long as they can hide it, and put the blame on someone else internally, everything is OK - even if huge damage is the consequence.

  9. Anonymous Coward
    Anonymous Coward

    There was seemingly no plan B

    one word too many, it should go like this: "There was no plan B"

    1. Eddy Ito
      IT Angle

      Re: There was seemingly no plan B

      I'm thinking plan A was just le fromage de tête (the head cheese) saying "wing it, we'll play it by ear".

      Icon because that seemed to be Sony's attitude.

  10. Where not exists

    What execs fail to realize

    Is that today every company is an IT company, no matter what your business is. That's the lesson for Sony here, and anyone else who thinks only of the business product, and not the underpinnings of their whole organization.

  11. Fatman
    FAIL

    Sony Pictures "hack"

    Sony Pictures seemingly lacked anything approaching an adequate disaster recovery plan or any incident response capability. There was seemingly no plan B to switch operations to another location in extreme situations. And where were the several backups or backup systems of any kind? The studio is sadly destined to be a case study in what can happen in the absence of disaster recovery and incident response for years to come.

    Can be summed up in just TWO WORDS: DAMAGEMENT FAILURE

    If this were my WROK PALCE1, executive ass would be lined up for a Trebuchet ride2.

    1 You must be a reader of CW's Shark Tank to get that reference.

    2 The Trebuchet is how our executive management sends off deserving employees on a new career trajectory. Unfortunately, the prime landing spot happens to be a "cactus patch". The workers are readying it for the Year End Separation event (i.e. those employees whose employment contracts will not be renewed, and expire at midnight).

    1. Anonymous Coward
      Anonymous Coward

      Re: Sony Pictures "hack"

      You seem not to live in a country that equates "employment" with "exploitation" and demands suitable compensation for the victim, then?

    2. LinuxGuy

      Re: Sony Pictures "hack"

      Sadly, you're commenting on something where you know nothing of the true facts.

      1. Jamie Jones Silver badge
        Headmaster

        Re: Sony Pictures "hack"

        ...as opposed to the false facts?

  12. Anonymous Coward
    Anonymous Coward

    Sony

    Lol.

  13. Mitoo Bobsworth
    Trollface

    Sony Pictures - too big to ...

    ... oops - quick, blame everyone else!

  14. Crazy Operations Guy

    $65 Million to make a shitty re-make of Annie

    Yet no budget to put proper DR in place... Hell, they could even go the cheap route and keep a bunch of cloud instances on hand to test their backup procedures and just spin them up full time in a major disaster.

    New idea for a server company:

    With each physical server sold, you'd get a free replica cloud server that runs in its place in case of a failure. It could be set up so that the server is backed up to that cloud replica as a VM image, run like Amazon's cheap instances where you get whatever free time is available, with more expensive plans for 1:1 dedicated systems. Such a system could also be offered as basic off-site backup storage and verification testing. Add some logic to the BMC on the servers and, with a proper VPN set up, any server on the internal network could fail, the cloud service would spring into action, and the end user wouldn't notice a thing except slightly higher latency.
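
    As a back-of-the-envelope sketch of the failover trigger such a service would need - the address, thresholds and the replica call are all hypothetical stand-ins:

    ```python
    import socket
    import time

    # Hypothetical sketch: watch a physical box and, when it stops answering,
    # spin up its cloud replica. spin_up_replica() is a stand-in for whatever
    # API the provider actually offers.

    PHYSICAL = ("192.0.2.10", 443)    # documentation-range address
    CHECK_EVERY = 10                  # seconds between heartbeats
    FAILURES_BEFORE_FAILOVER = 3      # tolerate brief blips

    def is_alive(host: str, port: int, timeout: float = 3.0) -> bool:
        """Heartbeat: can we open a TCP connection to the server?"""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    def spin_up_replica() -> None:
        """Placeholder for the provider's start-my-VM-image API call."""
        print("failing over to the cloud replica")

    def watch() -> None:
        misses = 0
        while True:
            misses = 0 if is_alive(*PHYSICAL) else misses + 1
            if misses >= FAILURES_BEFORE_FAILOVER:
                spin_up_replica()
                return
            time.sleep(CHECK_EVERY)

    if __name__ == "__main__":
        watch()
    ```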

    1. Dan 55 Silver badge

      Re: $65 Million to make a shitty re-make of Annie

      Well, you know how loath these corporations are to bring in money from the Cayman Islands without passing it through four shell companies first, so they can maintain the illusion on paper that their profits are one dollar a year. Imagine spending real money on actual physical backup servers.

  15. Bob Dole (tm)

    Here are a couple of possibilities:

    Sony's PlayStation Network was hacked a while back. Someone, at that time, warned Sony internally prior to the hack about the need for proper precautions, but was ignored. Afterwards they brought up the need for better security, and it still wasn't fixed.

    Eventually, said person has had enough and decides a bigger lesson is needed...

    OR

    A competitor hires someone to spy on Sony by getting a job in their IT department. The competitor decides it's time to just take them out, and the spy flips the switch.

  16. SecretBatcave

    Hmmm

    I work in VFX, so it's a bit funny to hear Sony bleat on about security. The consensus here is that it's an inside job. The person who did this *hated* Sony. To me it sounds like someone wanted to bring the house down.

    But the nub of the matter is this: Sony appears to have failed to follow its own security advice. When a VFX house applies to work on certain shows, it has to be audited to make sure that no footage will leak. Since Expendables 3 leaked (which couldn't have come from a post house, as it was the full movie, with sound - something none of us have), they've gone super Nazi on the requirements: segregated data and management networks, air gaps between the internet and internal networks, all data in and out of the building moved by hand, all USB/DVDs disabled.

    All internet access is done via terminal services. We had to battle to allow copy and paste...

    And yet, depending on the narrative you subscribe to, either someone stole HR data/email backups/restricted file shares via USB, or it was malware.

    Either way should have been impossible if they'd implemented their own guidelines.

    This of course assumes that it wasn't a rogue sysadmin. From the noise I've heard about the malware, it uses brute force to guess passwords. Do they not have account lockouts? (Another requirement...)

    Either way, they couldn't have given a shit about security - well, not in any meaningful way. Judging by some of the characters I've met from that neck of the woods, I can imagine that the higher-ups were extremely resistant to even the most basic of security measures.

    From what I understand they had byzantine VPN authentication, and yet people appear to have been able to gain access to the email server/backups.

  17. Anonymous Coward
    Anonymous Coward

    I keep hearing about backups in the comments

    But can you trust them?

    Part of the issue in the cleanup will be: what can you trust? How long were you really rooted, given that it appears on the surface that legitimate accounts were used, and there is no guarantee that ghost accounts or long-dormant accounts created for the purpose of the attack were not in place for some time? Does anyone have any idea of what normal behaviour looks like? How up to date are any of the hard copies of the DR plan, if there is one?

    Having dealt with this sort of thing (hence the anon post), it becomes a matter of trust, of which in this case there really is none left. Recovery is seemingly impossible when you don't know what you can recover. It will take quite a bit of time to review logs and piece things back together - not something you can do over a weekend, even if you have security in place. As one commenter alluded to, there must be some level of security knowledge, given the defences in place to prevent leakage of films; so most likely there were varying levels of security across the company. I am sure there are things in an organization of this size that were/are policy controls rather than technical controls, and this allowed boundaries to be crossed.

    I am not in any way defending Sony, just pointing out one of the problems of disaster recovery: having backups that work, or backup sites that work, does not mean you can get right back to business, as some of the commenters here seem to imply.

    1. Vic

      Re: I keep hearing about backups in the comments

      "Part of the issue in the cleanup will be: what can you trust? How long were you really rooted ..."

      This is why you segregate data from OS.

      It's a simple enough matter to rebuild servers with the automated deployment system[1] you keep offline and powered down except for such difficulties. The passwords will be a little stale, but that's not the end of the world. At that point, you have a blank server that will do what is needed.

      All you then need do is apply the relevant data - which, being non-executable, doesn't contain any root exploits.

      This doesn't fix any directory services you might have running - they're probably toast, and need to be rebuilt from whatever you can find - although it appears that both OpenLDAP and Active Directory can export to LDIF, so your backup could help there, even if it does require manual inspection before restore.

      But what you really need is a management structure that gives a flying fuck about DR. And they're remarkably thin on the ground[2].

      Vic.

      [1] I use Cobbler and Puppet for this sort of thing, but there are many options.

      [2] Many a time and oft I've been called in[3] to fix massive data loss. Invariably, someone on-site has bitched about proper backup in the past, but been ignored because it's not a problem management had encountered before, so they don't believe it will happen to them...

      [3] There are a number of people in the area purporting to do the same as I do, but for significantly less money. I get called in - often by my competitors - once they've failed.
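
      For the curious, a minimal sketch of the rebuild loop described in [1] - host names and the profile setup are hypothetical, and the cobbler/puppet invocations are the stock CLI entry points rather than anything confirmed above:

      ```python
      import subprocess

      # Hypothetical rebuild loop: flag each host for a clean network reinstall,
      # then let configuration management converge it. Host names are invented.

      HOSTS = ["mail-01", "file-01"]

      def rebuild(host: str) -> None:
          # Re-enable netboot so the host PXE-boots into a known-clean profile.
          subprocess.run(
              ["cobbler", "system", "edit", "--name", host,
               "--netboot-enabled", "true"],
              check=True,
          )
          subprocess.run(["cobbler", "sync"], check=True)
          # After the reinstall, configuration management restores the desired
          # state; the (non-executable) data volumes are reattached separately.
          subprocess.run(["ssh", host, "puppet", "agent", "--test"], check=True)

      if __name__ == "__main__":
          for host in HOSTS:
              rebuild(host)
      ```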
