Monday, June 20, 2011

Get with the times, decentralized security is so 2000 and late

You would think we would have matured enough as a security industry that there would be a consensus on this topic. However, we are not even close, mainly due to bureaucracy and politics. So let's survey the land of failed justifications.

"Were so big we have to be decentralized"

There is nothing that says centralized security means everyone has to sit in the same place. You can have people local to your sites all over the world and still report into a single organization.

"Our business unit is so different we need our own team"

This argument can often be valid for IT services that require customization and agility. It is rarely the case for security. Just because a particular business may require a different policy or higher standards doesn't mean it should go rogue. The overall marching orders need to be coordinated, otherwise you end up with gaps in visibility, protection, compliance, and so on.

"This is the way we have always done it here"

This is by far the weakest justification I've ever heard. I almost think it's purely an excuse to hand out C-level titles. News flash: if your organization has more than one CISO, you're probably not that good at IT or security. You have to ask yourself whether they are even qualified for the position, or whether you have a bunch of climbers looking for a security bullet point on their resumes.

Now, I'm not completely blind to the fact that separation is sometimes done for real reasons, unlike the horrible ones given above. Legal restrictions may prevent data from leaving a particular country or mandate particular requirements. However, I'm not aware of any law anywhere stating that your IT security goals and objectives can't come from a centralized structure. If there is one, please provide me with the source. Another valid reason that often arises is mergers and acquisitions. It's quite common for a new acquisition not to be fully integrated yet, or you may strategically want to keep it separate so you can divest it more quickly.

For me though, it's important to understand that your entire organization is fighting the adversary together. You fail and succeed as an entire company, not as a business unit. While an enclave or silo may have world-class security practices, it is only as strong as the weakest link. At some point there is a trusted process or network connection to another unit that may not have such good security. This doesn't mean that all security personnel need to be located at the corporate mothership. It simply means you need a common understanding of how to handle security incidents, architect your network, and implement better security controls. If you look around and see a lot of dotted lines and CISOs on your org chart, that's a pretty good sign your security efforts are disjointed, taking on too much, and doing nothing really well.

Wednesday, June 1, 2011

Shooting Blanks FTL

How many times in your career have you heard there are no silver bullets? I'm sure it's been quite a few times and then some. It definitely needs to be a part of your infosec mantra to ensure people don't have a false sense of security. It should be well ingrained that [AV, FIREWALLS, IPS, PROXIES, *] don't stop sophisticated attackers. They are at best a speed bump in the road.

So what is the point of this post? I've noticed a disturbing trend in the industry of knowledgeable individuals going to the opposite end of the spectrum. Instead of taking a practical approach, they shoot down any security control based on its flaws. One of my favorite quotes illustrates this perfectly.

Narrator: Tyler, you are by far the most interesting single-serving friend I've ever met... see I have this thing: everything on a plane is single-serving...
Tyler Durden: Oh I get it, it's very clever.
Narrator: Thank you.
Tyler Durden: How's that working out for you?
Narrator: What?
Tyler Durden: Being clever.
Narrator: Great.
Tyler Durden: Keep it up then... Right up.

Some people are just a little too clever for their own good. They routinely dismiss proposed security solutions as flawed and not worth pursuing. News flash: short of unplugging the power or pulling the network cable, all solutions have vulnerabilities to a certain degree. Doing nothing isn't an option. Accepting the status quo is a defeatist attitude in this little thing we call "cyber conflict". Yes, that's right, I used the word cyber, deal with it. APT is in your house stealing your stuff. Ask yourself this: do you go to a gunfight with a knife? No, you want a gun, preferably with some ammunition. In this case, the ammunition is your defense in depth. Yes, it most notably depends on people and process, but security tools play a big factor. While in this metaphorical gunfight the adversary has an AK-47 with a banana clip, you should at least show up with a Glock 22 loaded with a few rounds of .40 S&W. Yes, more times than not we will lose, but making the adversary duck, dodge, displace, and slow down is worth the effort. Who knows, you might even win some of those battles and eject them from your network like a spent cartridge.

Thursday, May 19, 2011

CEIC 2011 Recap

After leaving a cold and rainy 50 degrees and arriving in Orlando to a warm, sunny 80 degrees, I was immediately in a better mood. The Royal Pacific venue is awesome. It's located at Universal Studios, has nice rooms, and has great restaurants. Registration was quick and painless with no long DefCon-style lines. I was a bit surprised that 1,100 people were here, as I thought the con would be a little smaller. However, it doesn't feel as crowded as some others I've been to. They did mention that the number of attendees has doubled since 2009.

I first attended an Encase Forensic v7 Preview workshop outlining what is being released in June. They have FINALLY added true multi-core, multi-threading to take advantage of good hardware. Some highlights: all modules like the ProTools Suite are now included in the base product, and, more noteworthy, there is native processing for iOS, RIM, Android, and WinPhone6. There is also a new evidence format (EX01) and a shiny new frontend for opening cases and adding evidence. The new image format also supports AES256 encryption, so storing images on encrypted hard drives may be a thing of the past. The case processor has now been integrated and allows templates to be created that script much of what you want to preprocess, like mounting compound files. They are also breaking out a new product called Evidence Processor, which will allow you to distribute the load to multiple machines and merge the results back into a single case. Overall it looks to be a winner and should help them compete better against FTK.

Next I attended "Memory Analysis and Malware Triage" by David Nardoni and two guys from General Dynamics. This was a pretty basic presentation probably more worthwhile 3 years ago or to someone who had never done memory analysis. It included a lab using Memoryze analyzing an rbot sample. They covered the key indicators to look for in a memory capture and what they can reveal. They also mentioned a tool called FingerPrint from HBGary.

To wrap up Day 1, I attended “What’s new in Windows Forensics” by John Marsh. This was mostly a review of what has been out for a while now. Since I don't examine Win7 and 2008 systems on a regular basis, most of it isn't applicable for me until they become more mainstream in the corporate environment. There was the usual stuff on mining the UsnJrnl and TxF transaction journals. He also mentioned that last access times are disabled by default now. The most interesting segment was on the registry. You now have to check two different hives based on privilege level to capture all the details (UsrClass.dat). There are also transaction logs for the registry, which will be huge for malware investigations. Some of the attendees from the UK also talked about Woanware, which has some nice tools. Finally we covered mounting and sharing out volume shadow copies using vssadmin. VSS makes a restore point every 7 days, prior to patching, and whenever an unsigned application is installed.
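For my own notes, here is a minimal sketch of scripting that VSS trick from a Windows analysis box. It assumes an elevated prompt and that at least one shadow copy already exists; the device path is an example of what vssadmin typically reports, so yours will differ.

import subprocess

# List existing shadow copies; the "Shadow Copy Volume" lines give the device paths.
print(subprocess.check_output(["vssadmin", "list", "shadows"], text=True))

# Example device path; substitute the one reported by vssadmin on your system.
shadow = r"\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1"

# mklink is a cmd.exe built-in, so call it through cmd /c.
# The trailing backslash on the target is required for the link to resolve.
subprocess.check_call(["cmd", "/c", "mklink", "/d", r"C:\vsc1", shadow + "\\"])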

This was followed by a nice welcome reception by Guidance at the lagoon with food and drinks.

Day number 2 started off with a great keynote by Eric O’Neill. As a fan of the movie Breach, I was thrilled to see this talk. He talked a lot about his experiences busting Robert Hanssen, and it was awesome to hear the firsthand anecdotes. He mentioned how scared he really was when he stole Hanssen's Palm Pilot and had to sprint back to the room because forensics took too long imaging it along with the memory card, and he couldn't figure out which bag pocket he took it out of originally. He also said the best thing he ever learned from Hanssen came on day one: the spy is always in the worst position. This means he is always the one who will suffer the consequences if caught and is constantly looking over his shoulder. O’Neill also believed that while Hanssen may have started spying originally to make money for his family, he wasn't greedy and told the Russians to stop giving him so much money and keep it under $10K per drop. He ultimately thought Hanssen kept spying because the Russians made him feel like he mattered and was a success, while he was loathed by his peers at the FBI. Aside from Hanssen, he also touched on some other interesting topics. He said while on travel he always puts the do-not-disturb sign on his room and then sets traps inside. Of course, someone always enters his room looking for "stuff". He also covered some common things he is seeing on the many corporate espionage cases he has worked. From dumpster diving and posing as contracted shredding companies to well-placed interns and phony shell companies, the environment is ruthless. He reiterated an idea all should be familiar with: if you have something cool, somebody wants to steal it.

Next I attended "Android? Encase Does.." by Andy Spruill. I liked this lab because we got to walk though analyzing evidence files from a Sprint Evo 4g. So Android is leveraging YAFFS with a FAT formatted sdcard typically. Google has pushed hard for developers to always write their application data to the sdcard, however this isn't always the case. The two options for acquiring included rooting the phone and usb to usb debugging. The former allows you to see much more of the file system, however its way more intrusive. The main location for application data is in /Android/data. You should always process the sdcard first as many tools accessing the built-in flash will modify the timestamps on the sdcard. Once processed you can export location data to a KML file and view it in Google Earth for an awesome tracking visual. Its good to become familiar with SQLite and SQLiteBrowser as all the applications use it. Also of interest, the navigation app records the turn by turn direction as wav files that can be retrieved to show where the target may have been driving at a certain time. Spruill suggest that your practice rooting Android phones as it is quickly becoming an essential skill.

From there I listened to Rob Lee's Super Timeline presentation. The session basically walked through building an accurate timeline using SIFT tools (regtime, fls, log2timeline, etc.). He noted that FAT will stay in local time regardless of what time zone you are in. He also mentioned that NTFS keeps 8-12 timestamps (STDInfo, FNInfo, SFNInfo), and he pointed out MFT Examiner, a tool from the UK. Finally, he mentioned that while it still has lots of legitimate hits, looking for all zeroes in the nanoseconds field is a decent indicator of timestamp manipulation (timestomping).
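That last indicator is easy to script yourself. NTFS FILETIMEs count 100-nanosecond intervals since 1601, so a value evenly divisible by 10,000,000 has no sub-second precision at all, which is what second-granularity timestomping tools leave behind. A tiny sketch, keeping his caveat that legitimate hits are common:

def looks_timestomped(filetime: int) -> bool:
    # True when the sub-second portion of an NTFS FILETIME is exactly zero.
    return filetime % 10_000_000 == 0

print(looks_timestomped(129494865600000000))  # True  - lands on an exact second
print(looks_timestomped(129494865601234567))  # False - has sub-second precision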

At this point, I couldn't fathom sitting through an Enscripting 101 session, so I got on the waitlist for “Revealing Intent with Windows 7 Artifacts” by Alissa Torres from Northrop Grumman. She was a great presenter and had people engaged the whole time with her HappyCubes. There are two types of Win7 jump lists: AutomaticDestinations for users and CustomDestinations for apps. You can mount the compound files to gain further details. The .search-ms connectors have lots of metadata and can be exported as XML to find more user activity. Federated search (.osdx) is also a new feature in Win7 that allows you to search a bunch of predefined sources, including websites and network shares. Libraries are another artifact: groups of files from different locations kept in a single container. StickyNotes can also contain some user attribution. TZWorks and NirSoft provide good shell bag parsers. Yaru is a nice tool for finding deleted registry keys. If you delete a directory in Win7, there will only be an $I file, not an $R file. DMThumbs is a good parser for the new thumbs.db format in Win7.
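For reference, the AutomaticDestinations jump lists are OLE compound files holding numbered LNK streams plus a DestList index, so a few lines of Python can at least enumerate what's inside. A rough sketch, assuming the third-party olefile package is installed; the file name is a placeholder, since real files are named by a 16-hex-digit AppID.

import olefile

path = "example.automaticDestinations-ms"  # placeholder; real names are <AppID>.automaticDestinations-ms
ole = olefile.OleFileIO(path)
for stream in ole.listdir():
    data = ole.openstream(stream).read()
    print("/".join(stream), len(data), "bytes")  # numbered LNK streams plus DestList
ole.close()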

After this there was a cool happy hour in the Exhibit hall followed by a great party by Mandiant at the Wantilan Luau. They gave out t-shirts and had an awesome open bar.

I kicked off Day 3 with Simon Key's presentation “File Identification and Recovery Using Block-Based Hash Analysis”. I must confess I was not properly caffeinated, so it took me a while to get into this. I originally learned about this technique about three years ago when attending training by Guidance. Simon has made tremendous improvements in the quality and usability of the enscript. Its help function has a nice explanation of how to use it. First, it's a good idea to close all your mounted compound files, as they may give you errors when running the enscript. If you are doing multiple files, always use the hash list. He also mentioned a common mistake is to think it has found parts of your file when it's only sectors of all 0x00s or 0xFFs. The intelligent tail analysis function does take a lot of time, but it helps when the last block of your file is missing and you don't want to keep hashing the same block over and over. Simon walked us through three different demos, which were great. VLC actually played a partial recovery with only 8% of the sectors. He also showed off a new feature of his enscript that he calls Block-Based File Identification, aka fuzzy hashing, aka ssdeep. If you know the structure of your file, for example Word docs are set up in 64-byte blocks, you can find varied versions of the file. Make sure to check "process all data with current files" so you don't waste time on deleted data. Overall I enjoyed this one quite a bit; it was a nice refresher on the subject.
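To make the core idea concrete, here is a toy Python version of block-based hash matching under a couple of assumptions: fixed 512-byte blocks, MD5 as the block hash, and skipping blocks that are all 0x00 or all 0xFF (the false-positive trap he mentioned). The real enscript does far more; this is just the gist.

import hashlib

BLOCK = 512

def block_hashes(path):
    # Hash the known file in fixed-size blocks, ignoring trivially common blocks.
    hashes = set()
    with open(path, "rb") as f:
        while chunk := f.read(BLOCK):
            if chunk == b"\x00" * len(chunk) or chunk == b"\xff" * len(chunk):
                continue
            hashes.add(hashlib.md5(chunk).hexdigest())
    return hashes

def scan_image(image_path, known_hashes):
    # Walk the image block by block and record offsets that match the known file.
    hits = []
    with open(image_path, "rb") as f:
        offset = 0
        while chunk := f.read(BLOCK):
            if hashlib.md5(chunk).hexdigest() in known_hashes:
                hits.append(offset)
            offset += len(chunk)
    return hits

# Example usage (paths are placeholders):
# known = block_hashes("target.doc")
# print(len(scan_image("evidence.dd", known)), "matching blocks found")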

Next I skipped out on the Mock Court Trial presentation to get into Rob Lee's session on web browser analysis. I'm glad I did; it was packed as usual. There was a heavy focus on the new stuff in IE8/9 and then, at the very end, on Firefox. Most of the material is from his SANS 408 course. First off, when you see file:// in the index.dat file, it doesn't necessarily mean the file was opened in the browser; it more likely got there through local file clicking. There is no such thing as last-modified in the index.dat, so if your tool shows one that doesn't match, it most likely means the tool isn't functioning correctly. WebHistorian and NetAnalysis have been updated to fix this. DOM storage is a great place to look, as most of the app preferences are stored there. Session recovery also has some very good evidence, like clear-text passwords; however, there isn't a lot of automated parsing yet. MiTeC makes his recommended Structured Storage Viewer. Suggestsites.dat can also give you a clue as to what the suspect was doing even if they have cleared their browsing history. Internet Evidence Finder (IEF) is another favorite tool for carving evidence out of memory and disk; however, the timestamps aren't found for memory. It's worth noting that the pagefile can often move artifacts back into RAM after reboots. The infamous Chewbacca defense is often used to attack evidence by demanding you prove something isn't possible. Flash cookies have become huge over the last few years, as they don't expire, are browser independent, and aren't cleared automatically. Rob recommended reading the WSJ article series on web privacy. A nice trick for recovering files is to make a file with the same name in the exact location and then use the recover-last-version function to restore it from VSS. The Firefox sessionstore.js is in clear text, and you can use FirefoxSessionStoreExtractor from Woanware to parse it. He believes the privacy mode in Firefox is superior to IE's because it overwrites instead of deleting. Another cool indicator is when exactly an hour of history is missing, showing the user cleared the last hour.
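Since sessionstore.js is just JSON in the versions he covered, you don't even need a dedicated tool for a quick look. A minimal sketch; the windows -> tabs -> entries layout assumed below matches Firefox 3.x/4-era files.

import json

with open("sessionstore.js", encoding="utf-8") as f:
    session = json.load(f)

# Walk every window, tab, and history entry saved in the session.
for window in session.get("windows", []):
    for tab in window.get("tabs", []):
        for entry in tab.get("entries", []):
            print(entry.get("url"), "-", entry.get("title", ""))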

After a steak lunch, most of the attendees were in a food coma for Litchfield's session “Database Breach Investigations Made Practical”. He is really one of the few people creating DB-specific tools in this sector, which is awesome. Apparently he doesn't have to work anymore either, since he sold his company back in 2007, but he is still giving back to the community. He said that Oracle is harder to triage because there are better native tools for MSSQL and MySQL. He started off by outlining all the different DB artifacts that can be used. He mentioned that if you ever see the Java wrapper class in the DB ObjCode, which is typically in RAM, it is a sure sign of an intrusion because the code isn't used anymore. LogMiner was a tool he recommended. The block size for Oracle is 8192, object ID 10 is the user table, and ID 18 is the obj table. Also check out databasesecurity.com. Some of his standalone command-line tools include filter, dumpaction, and orablock. Another GUI tool is DataBlockExaminer for Oracle, which will show you deleted rows in red.

To wrap up Day 3, I went to "The Art of Mobile Device Malware and How to Detect and Defend Against It" by Roy Hu. Who knew, but apparently Accenture has some good talent in the mobile security space. They are seeing quite a bit of non-targeted information stealers and banking-targeted malware. They also have found that while remote wipe is recommended, it often leaves artifacts behind on the phone. They expect to see Near Field Communication (NFC) take off more in the US the way it has in Asia and the EU. They briefly mentioned the mobile variant of Zeus, dubbed Zitmo. The second half of the presentation was a technical dive on DroidDream, given that name because it was only active at night, when the owner of the phone was most likely asleep and charging it. DroidDream used XOR encryption and leveraged Exploid on Android 2.1 and below and RageAgainstTheCage on 2.2 and below to root the phone. This has all been patched in Android 2.3. They mentioned that you should use Lookout AV for your Android phone, however there are also trojaned clones of it, so beware. Also, Lookout had a nice presentation at DefCon 18 which is recommended. One of their favorite Mobile Device Management (MDM) tools is by Good Technology because it actually uses its own encryption and separates corporate data from personal data. I spoke with them afterwards, and they said there isn't any good anomaly detection today for malware on cell phones that they are aware of; you're basically stuck reviewing logs of installed apps and comparing them to OSINT feeds. I asked them specifically about malware targeting specific companies, and they didn't have any examples of that.

On Wednesday morning, I attended “iOS Forensics and Encase” by Sean Morrissey. He recommends having a small charger for use inside a faraday bag to extend battery life and avoid the phone locking. He said it's best practice to use a Mac OS X workstation and its native tools for analysis. He likes the PList Editor from the development tools, however you need Xcode 3 instead of the newer Xcode 4, which removed some functionality. MacForensicsLab is one of his favorite data carvers. He also likes the iPod Robot plist editor for Windows platforms. He said the forensic community has known about the GPS log data that Apple has kept since iOS 3, but kept quiet about it to avoid notice. Since it went public, what used to provide up to a year of GPS data is now going to be only 7 days and probably encrypted with iOS 5. Another favorite is MSAB XRY, which does a complete physical dump, and also ZRT for doing automated screenshots. He thinks about 80% of what you need can be had from logical dumps, and that usually is enough. Physical dumps are going to mostly contain fragmented data that you have to manually parse out. His favorite acquisition tools are FTS iXAM and AccessData's MPE+. He has verified these tools by using HFSDebug and tracking the incremental writing of catalog IDs. Encase has no native HFS+ support yet, however you can use a hex editor to change the file header from HX. You can also change a raw dd file to a .dmg extension and Macs will mount it. Apple devices always use local time, and he likes TimeLord for analysis. C4All.CA has nice tools (C4M and C4P). Binary Plist Finder & Parser are also good tools. The AT&T SIM card only contains the last 10 calls and provider data, and it is being phased out in favor of built-in hardware in the future. The Encase Neutrino product doesn't parse out as much data from iOS as some of the other products like Cellebrite.
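On the plist front, you don't strictly need the Apple tooling for a quick look: Python's standard plistlib reads both XML and binary plists (3.4 and later). A small sketch; the file name is a placeholder, and it assumes the plist carries a dictionary at the top level, as most do.

import plistlib

with open("com.example.app.plist", "rb") as f:  # placeholder file name
    data = plistlib.load(f)  # handles XML and binary plists alike

for key, value in data.items():
    print(key, "=", value)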

To close out the con, I attended "Encase and Flasher Box HEX Dumping Analysis" by John Thackray. He is British-born but a Kiwi by choice, in his own words. When it comes to cell phones, there is no single product that is going to get you everything you want. Extraction is the proper term, as a true bit-by-bit copy isn't really possible. A flasher box should be a last resort, as it can sometimes destroy evidence and even brick the phone. Test devices are essential. Check out the forum phoneforensics.com. All the firmware data you want is on the chipset and has nothing to do with the SIM card. Flasher boxes have their code updated frequently, so you need to update weekly. The process is very fast, however the data you get back is highly fragmented. Locate all the maintenance codes, as they are the best way to get the phone to spit out make/model/version information and perform other functions. PM (Permanent Memory) records are 0-999. Absolute records are the memory offsets used to create binary dumps. He showed us how the unlocking process worked on a Nokia, which was cool to see; it basically comes down to knowing where the IMEI and security codes are kept. To figure out the structure when a new phone comes out, the process is best done with KDiff3: by using a control phone and comparing hex dumps, you can track when a text is sent or a call is made and see where the changes land. The phone numbers themselves are usually stored in reverse nibble format. Timestamps are often different for sent messages and received messages. Hexaminer is a great tool for creating searchable 7-bit hex terms from known ASCII text. Next we worked through a lab on a Samsung phone and manually recovered SMS text messages from raw hex. Someone from the crowd recommended a tool called Alibi (SMS Edit) for modifying SMS text messages. Thackray said TigerText is another one that sends text messages and immediately deletes them. He also gave us a tool on CD called LiveExaminer.
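For anyone who wants to try the reverse nibble decoding by hand, the encoding just swaps the two digits packed into each byte, with 0xF padding an odd-length number. A quick sketch:

def decode_reverse_nibble(raw: bytes) -> str:
    digits = []
    for byte in raw:
        digits.append(byte & 0x0F)         # low nibble holds the first digit
        digits.append((byte >> 4) & 0x0F)  # high nibble holds the second digit
    return "".join(format(d, "x") for d in digits if d != 0x0F)

# Example: 0x21 0x43 0x65 0x87 0x09 decodes to "1234567890".
print(decode_reverse_nibble(bytes([0x21, 0x43, 0x65, 0x87, 0x09])))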

So now that my first CEIC is in the books, I have to say I was very impressed. The venue in Orlando is awesome and everything was well run, except for a minor lighting snafu. It wasn't overcrowded like Blackhat/Defcon, the food was good, and it was easy to talk to the presenters. I think the only drawback is that a good chunk of the crowd isn't very technical; think e-Discovery legal people and cops new to forensics. I would also say it's better to register sooner rather than later to make sure you get into your favorite sessions. In the future I would like to see a dedicated advanced technical lab-based track for people who have been doing digital forensics for a while. Being able to work through some evidence is much more appealing than a pure lecture. I will definitely come back to the Orlando location on the odd years, as this was a great experience for me.

Best Presenter - Rob Lee - I think he was the most polished of the speakers and talked about things I wanted to hear. You can tell he is someone who likes to know why something works and not just get the output of a tool.

Most Fun Presentation - Alissa Torres - She kept the crowd smiling the entire time and her enthusiasm was infectious.

Best Presentation - John Thackray - This shed light on an area of phone forensics that most people don't have a lot of experience in. It delivered exactly what I was looking to hear and probably taught me the most of all the presentations.

Best Vendor - NetWitness by a nose - Mandiant had a great party, but I liked the NetWitness booth the most because they took the time to really show me the product and give me the details I was looking for regarding capabilities and deployment scenarios. Their tools show great promise for processing things in bulk off the wire.

Tuesday, May 10, 2011

Containment Strategery

One of the key metrics Computer Incident Response Teams (CIRTs) often measure is time to containment. This is often seen as a way to gauge the performance of the team, as it tracks how long it takes to contain a compromised or infected computer from the time of reporting or detection. This number varies widely across companies, and many simply do not have the capability or desire to record it. I think this metric often indicates how well the CIRT knows its environment and the maturity of its processes, so I highly recommend making it a key performance indicator in your CIRT program.

Today, however, I would like to talk specifically about an appropriate goal for this metric in relation to compromise by advanced external threats, so I will be excluding non-targeted malware and insider scenarios. On one end of the spectrum you have teams that like to contain as soon as possible to limit any possible impact, whereas on the opposite end you have teams that like to wait a long time (weeks or months, usually contracted responders) to fully scope an incident prior to making any major containment efforts. Before we proceed further: containment can mean many things, but I will define it here as isolation or removal of the compromised computer from the network. That being said, why would you choose either of those extreme options? One strategy is to quickly deny the adversary any asset before they can conduct further operations inside your network. The big pitfall here is that you don't have enough time to figure out exactly how they compromised the system and what other systems they control in such a short span. Waiting longer, on the other hand, allows you to fully scope the extent of the breach, with the hope that the investigation doesn't alert the intruders that the defenders are on to them. This routinely fails, as advanced intruders know to mix up their backdoor tools and maintain several entry and exit points. To me, rather than being time-focused, I prefer a process flow that scopes the incident for you.

Questions like the following are key to this flow:
What method was used to compromise the system?
How long have they been active in the environment and are they still active?
Which system was ground zero for the intrusion?
What accounts have been compromised and can they be reset in a timely manner?
What ingress and egress points are the intruders using?
What systems have been touched by the intruders?
What command and control (C2) method are the intruders using and can you decipher it?
Have you seen this group in your environment before?
Have you documented the indicators of compromise (IOCs)?
Do you have the ability to scan your environment for these IOCs?
Do you have the capability to take the system offline without a disastrous business outage?
Has the scope of the breach and/or data loss been determined?
Has senior security leadership been briefed on the incident?
Is data exfiltration actively occurring?

These are just some initial questions you need to add to your containment decision process flow. I can tell you that being on either end of the spectrum is not successful in large companies where you don't have good system inventory and a full internet gateway registry. It's possible to do either if you have full mastery of your computing infrastructure, but that is a rarity. I think based on your capabilities and the questions above, you can create a plan that gets the system contained as quickly as possible without tipping off the intruders and/or allowing them to continue to develop their foothold on your network.

Stay secure my friends.

Friday, April 29, 2011

When to burn a Zero-Day?

So I've often heard people say, "Why would you waste a zero-day on <insert something>?" And on the opposite end of using your zero-day, you have the hoarders who simply collect them to keep in their back pocket. So the question remains: when is the appropriate time to actually use a zero-day for legitimate purposes?

The primary impetus for this discussion was someone smugly claiming they would never use a zero-day in a hacking competition or CTF event. I can understand that stance, however if you're trying to win something like Pwn2Own or some other serious hacking competition, why wouldn't you? Is it truly a waste of a good zero-day if it brings you respect in the industry and potentially more consulting work? I don't believe so; however, financially, given the cost of exploit development, it may be wasteful. I think it really depends on the exploit. I've heard that security research companies often task teams of individuals for months to years just to develop a reliable remote exploit for a popular platform or application. That isn't cheap in terms of billable hours by any means. Financially it may make sense to sell your exploit, however as a whitehat and someone who is a fan of responsible disclosure, I can't agree with that line of thought. The other option may be to leverage the exploit in your pen testing engagements. So how would that benefit the customer? Yes, it may give you credibility, but if they can't do anything about it patching-wise, then nothing is gained. I don't buy into that approach unless you as a pen tester can recommend a solid mitigation plan for the vulnerability you've exploited.

To wrap things up: unless you are specifically tasked to research and deliver a working exploit to a customer for their use, I think it makes the most sense to just follow responsible disclosure. On the other hand, if you are trying to build up your credibility and/or consulting business, then it may also make sense to use one in an engagement or competition. I still do not believe the customer is looking to be exploited by a zero-day without any mitigation possibilities, unless you can show them that the exploit is already being traded in the underground. In that case, it is not really your private exploit but a legitimate attack they need to prepare for.

Friday, April 22, 2011

What scares you more: APT vs Anonymous vs Wikileaks?

So the past few years have been very interesting in IT security, as the number of public disclosures has increased exponentially. Victims like Google, RSA, HBGary, and Bank of America and consultants like Mandiant, McAfee, and Verizon Business have provided more details than ever about the serious threats facing the public and private sector. It's almost coming to the point of information overload, and that's even after weeding out the FUD and sales talk.

So as a security leader in your company, what keeps you up at night? First let's define the three "threats" I'm detailing. Yes, there are still plenty of other big-time threats like organized crime, however I'm keeping the list intentionally small and current.

First you have our beloved APT. I hate this term; it's been polluted by the originators of the term, by the people who should know better calling it FUD, and by the sales/marketing folks. But it's what we have to work with. APT has various goals, but the noisiest among them is theft of intellectual property. The outcome of such attacks is also varied; in the near term it can impact business negotiations and M&A activity, and in the long term it turns whatever special sauce your company has into a commodity available to other companies that can likely do it cheaper than their US/EU counterparts. Of the three, this is by far the hardest to detect and respond to. It takes a strong security leader with both a short-term tactical plan and a long-term strategic vision to effectively mitigate this threat.

Next you have the Anonymous threat. For this discussion, just assume Anonymous = hacktivists. The first rule of dealing with hacktivists is do not underestimate them. HBGary did, and they are paying dearly. Hacktivist groups are so different it's hard to categorize them, however they generally target your company for its perceived policies, ethics, actions, or political stances. Like other threats, this requires a comprehensive approach to hardening your network, with a particular focus on email and document security. The outcome of such attacks is immediately felt, as it's routinely publicized. Having a proactive communications and legal team is also crucial to dealing with this threat. While it's not always the case, acting in a transparent and ethical manner could also alleviate these fears. But that might just be too much to ask of many businesses! :-)

Finally, we have johnny-come-lately Wikileaks and the lot. There are several Wikileaks-type sites, and for this discussion we can consider them the disgruntled-insider threat (FYI, and before you call me out on it, I'm aware that Wikileaks stole some documents via P2P). The outcome of this attack is very similar to hacktivism in that you have an immediate public relations nightmare. Countering insider threats is extremely difficult. In basic terms, you cannot stop a skilled, privileged insider. The upside is that they are the most likely to be caught afterwards and convicted. Companies have to use that to their advantage. Aside from the typical controls like access logging, DLP, and DRM, there is a whole other set of controls companies don't use. You should routinely communicate to employees that they are being monitored and even demonstrate this capability at internal security/IT shows. Don't show them every card you have up your sleeve, but show them that the deck is stacked against them if they try to steal company data. We know this not to be the case in terms of prevention, but the psychological effect is real.

So while I'm not going in depth on countermeasures, I've generally outlined the threats. Yes, I'm not adhering to the precise definition of a threat in all cases, but you know what I mean if you are in IT security. So how do you rate them?

C-Level Executives/Upper Security Management
1 - Wikileaks
2 - Anonymous
3 - APT

CIRT/IT Security
1 - APT
2 - Wikileaks
3 - Anonymous

These are my rankings of what I think and what I believe upper management thinks. As I thought about it, the ranking almost correlates to what causes the most discomfort for the person involved. If you are an incident responder, you don't want advanced foreign CNE actors gliding through your network undetected. If you are an executive, you don't want to do anything that will jeopardize the stock price in the near term. Every company is different, so it's not a one-size-fits-all solution. It never is. However, in my opinion, taking a long-term approach to the defense of your computing assets is the way to go. There are NO silver bullets. Knee-jerk reactions need to be avoided to ensure they don't hurt rather than help your company. Consistent security leadership along with a C-level security advocate is beyond important.

Stay secure my friends