Tuesday, 14 December 2010

#PCI-DSS and #Cloud Adventures - Fun!



As I mentioned in my earlier post, the publication of a PCI DSS reference architecture for the cloud is interesting.

It is interesting because PCI DSS v2.0 (I still find it difficult to write ‘v2.0’!) has only just included additional guidance for virtualisation in the standard, as below:

“System components” also include any virtualization components such as virtual machines, virtual switches/routers, virtual appliances, virtual applications/desktops, and hypervisors

Requirement 2.2.1 Note: Where virtualization technologies are in use, implement only one primary function per virtual system component.

 

Testing Procedure 2.2.1.b If virtualization technologies are used, verify that only one primary function is implemented per virtual system component or device.

 

So when the SSC made ‘major’ revisions to the standard and released a revised version to ensure the PCI DSS is understood clearly, I was amused and surprised at the same time that Service Providers and some QSAs went ahead and certified or approved these relatively new technologies. The SSC still has plans to work on various areas like point-to-point encryption, tokenisation, mobile payments and probably cloud technologies.

The Service Provider and The QSA

It is understandable that the vendors that got together to publish this reference architecture are keen to offer this service as Service Providers. There is nothing wrong with this. Following this publication, on Dec 7 Amazon announced that its AWS cloud offering has achieved PCI DSS compliance as a validated Service Provider. A couple of good posts assessing this can be found here and here. Hence, if Merchants want it, they’ve got it.

One point I would like to highlight before moving on, though, is that for a vendor trying to offer such a service (a cloud-based, PCI compliant infrastructure), it is very important to get the QSA on board early. More important still is to ensure the QSA is well versed in the infrastructure complexities of the cloud world and the underlying technologies and challenges (e.g. multi-tenancy). AWS already holds SAS 70 and ISO certifications, which should have made its compliance journey a bit easier. I’ve heard of many experiences of QSAs not ‘understanding’ the solution they’re assessing, resulting in wasted effort.

The Merchant / The Customer

<<http://www.gillette.com/en/us/Products/Razors.aspx>>

I believe the above link is a good start to making the point. I’m sure not all of us have or use the ‘Fusion ProGlide Power Razor’ (//if reader=female then apologies). Although we’d all love to, not all men upgrade to the best razor on release just because Gillette says it gives the ‘Best Shave’ whereas the ‘M3 Power Razor’ only provides a ‘Good Shave’! Similarly, whether a Merchant or customer moves to a cloud-based offering depends on a number of variables, not just its various benefits. AWS has recently started offering a 'Free Usage Tier' to increase cloud adoption.

And as mentioned in the post linked earlier, Merchants must ensure their own compliance in addition to making sure the Service Providers they contract with are PCI DSS compliant.

The PCI SSC

The PCI DSS and the other standards in the suite (PA-DSS, PED) have done a great job of bringing security to the forefront for a particular kind of data in the relevant organisations. There is a lot here to adapt for the other sensitive data stored, processed or transmitted in organisations. Certainly, the stick behind the success of PCI DSS is the potential for fines by the card brands and the potential for direct monetary impact from data loss. However, there is a danger that the SSC will end up forever working on ‘additional guidance’ for new technologies unveiled by zealous and innovative vendors over the years. It is all good as long as the data, and not just the containers, is secured!

Thank you.

 

Some recent developments


[Image: AWS service disruption chart]

I like this graphic in particular: it shows the weekend’s disruption in the cloud. What it does not show is the impact on the businesses supported by AWS in the peak business season.

Tuesday, 9 November 2010

Cloud, virtualisation and PCI DSS v2.0

For readers who keep up-to-date with the industry, virtualisation and cloud technologies need no introduction. PCI DSS v2.0 includes guidance on virtualisation compliance and is well explained and assessed here and here.


Cisco, HyTrust, VMware, Savvis and Coalfire have collaborated to construct a cloud reference architecture (here) that aims to address some of the unique challenges of the PCI DSS.


This certainly is an interesting read. I’ll post my observations in a follow-up post; meanwhile, it would be useful to know what you think of this development.


Thank you for reading and subscribing to this feed. Your comments are always welcome.

Thursday, 4 November 2010

Approved Scanning Vendors (ASVs - PCI DSS) in the UK

PCI DSS requirement 11.2 requires organisations to run internal and external vulnerability scans at least quarterly and after any significant change in the network (such as new system component installations, changes in network topology, firewall rule modifications or product upgrades).


Testing Procedure 11.2.1c requires the assessor to validate that the scan was performed by a qualified internal resource(s) or qualified external third party, and if applicable, organizational independence of the tester exists (not required to be a QSA or ASV).
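As a rough illustration of the internal side of these requirements (a hedged sketch, not the ASV process - external quarterly scans must come from an Approved Scanning Vendor, and the targets below are placeholders), a qualified internal resource might wrap a scanner such as nmap like this:

    # quarterly_internal_scan.py - illustrative only; targets are placeholders.
    import subprocess
    from datetime import date

    TARGETS = ["10.0.1.0/24"]  # assumed in-scope subnet

    def run_internal_scan():
        """Unauthenticated service sweep using nmap's 'vuln' script category."""
        report = f"internal-scan-{date.today().isoformat()}.xml"
        cmd = ["nmap", "-sV", "--script", "vuln", "-oX", report] + TARGETS
        subprocess.run(cmd, check=True)

    if __name__ == "__main__":
        # Trigger quarterly (e.g. from cron) and after any significant change
        # - new components, topology changes, firewall rule modifications.
        run_internal_scan()

Scheduling the scan is the easy part; reviewing the results and remediating is where the real work of 11.2 lies.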


Approved Scanning Vendors (ASVs) are organisations that validate adherence to certain DSS requirements by performing vulnerability scans of the Internet-facing environments of merchants and service providers. The Council has approved more than 130 ASVs; the complete list can be found here.


I received a request recently to provide some guidance on ASVs that offer their services in the UK. Below are the companies listed as ASVs on the PCI SSC website today that offer vulnerability scanning services in the UK. To clarify, an ASV is not required to be local; there are a number of other ASVs on the website that offer a similar service, baselined by the PCI SSC.


• Ambersail

• Context Information Security

• Digital Assurance Consulting

• Integralis

• Matta Consulting

• MWR Infosecurity

• NCC Group

• Nettitude

• ProCheckUp

• Protiviti

• RandomStorm

• Trustwave

• Westpoint
 


Again, this listing should only be taken as a reference for organisations seeking to engage an ASV locally. Please feel free to add any ASVs that offer the service from the UK but have been unintentionally missed from this list.


 


Disclaimer: I do not work for any of the above listed ASVs and do not intend to endorse their ASV services over those of other companies offering such services globally.

Thursday, 9 September 2010

PCI DSS - Ownership and Accountability

Accountability is a problem I come across in PCI DSS time and time again. Recent challenges with ownership and accountability prompt me to write this post.


Complex systems require a complex set of controls to ensure that they work as intended. These controls aim to reduce the risk of disruption to the intended operation. However, incidents do happen, and not all controls are adequate the first time they’re put forward. Hence, controls are revised to improve their relevance and reduce the likelihood of recurrence. Take an aircraft, for example. Or a car.


Payment cards are just one such complex creation. To give customers a convenient way to pay for goods, banks created this complex system with input from entities such as Cardholders, Merchants, Service Providers, Payment Gateways, Acquiring Banks, Card Brands, Issuing Banks and various other third parties.




In the world of digital electronics, data passes very quickly from one system to another, each owned and/or managed by a different entity. In the case of payment cards, this data is very important as it translates into cash/goods with relatively little effort. With the increased complexity of business delivery models and the pressure to deliver new services to the customer quickly, security of this data can take a back seat and, with it, so do the appropriate agreements between the various entities delivering the service. This is where the controls put forward in PCI DSS can help to secure cardholder data.


It is very important that PCI DSS requirement 12.8 is given appropriate emphasis as a key control, ensuring all relationships between the various entities are clearly described in written agreements with appropriate legal teams involved. As an example, where service providers, merchants and third parties collectively deliver a service, they should review the cardholder data flow diagram together and ensure each requirement is owned and complied with by the appropriate entity. This should drill down to each individual requirement, preferably on the assessment sheet. Merchants can set up reporting structures/metrics with service providers and their third parties to ensure compliance is maintained.
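To illustrate (a minimal sketch - the entities and requirement splits below are invented, not drawn from any real assessment), the agreed ownership can be captured as a simple matrix that makes unowned requirements obvious:

    # Invented example: map each requirement to the entity/entities owning it,
    # then flag anything left unowned before the assessment starts.
    OWNERSHIP = {
        "Req 3.4 - render PAN unreadable in storage": ["Service Provider"],
        "Req 10 - track and monitor access":          ["Merchant", "Service Provider"],
        "Req 12.8 - manage service providers":        ["Merchant"],
        "Req 9 - restrict physical access":           [],  # nobody signed up!
    }

    for req, owners in OWNERSHIP.items():
        if not owners:
            print(f"GAP: no entity owns '{req}'")

However it is recorded, the point is the same: every requirement in scope should have a named owner before the assessment, not after the incident.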


In the absence of such rigour around ownership and accountability, the entity in question will eventually have a difficult time when a major incident happens. This may end up in a review of liabilities, service contracts and SLAs, and may severely affect the business. Most of all, it affects user confidence. The end user empowers the payment system and trusts that the system is handling their data securely. It is very important to ensure this trust is maintained. Surely, this is how this complex system was designed and intended to operate.

 

Friday, 20 August 2010

Just Check In, we'll sort out the rest!

With great power comes great responsibility, but I guess Zuckerberg didn’t watch SpiderMan. Turning on features without appropriately notifying users (an email would be nice!) is probably becoming the norm; it is simply assumed that all users will get the news, unless you follow the company blog. But when the change, in this case a new feature, affects the privacy of 500 million+ users, I believe there has to be some sort of notification mechanism. Well, tough luck Facebook users, it is not to be yet. This particular change is not as bad for those outside the US, as the Facebook Places feature is only active in the US. However, each user’s profile ‘Privacy Settings’ will show three new entries/settings that default to:

Things I share:

Places I check in = ‘Friends Only’

Include me in “People here now” after I check in = ‘Enable’ checked

Things others share:

Friends can check me in to places = ‘Select one’ (Enabled/Disabled)

Ideally, the new features would have been set according to the user’s preference, but again, Facebook has set them for you. One would have preferred a notification asking the user’s permission to enable the Places feature, followed by a prompt to set the above three settings to their own liking. Anyway, if you don’t like Places, just disable it as below by going into Account > Privacy Settings > Customise Settings


[Screenshot: disabling Places under Customise Settings]


Often in business, the end user benefits from competition. In this particular case, with personal data at stake, it seems the cost is privacy.

Personally, I find the ‘Friends Only’ setting quite generic; ‘Friends’ on Facebook does not accurately reflect real-life relationships. Facebook 'Friends' could be classed as acquaintances, friends and close friends in the real world, and one may not want all 'Friends' to see status updates, locations, photos, videos, etc. The ‘lists’ functionality within Facebook can be used to group friends into appropriate lists and fine-tune the privacy settings to restrict sharing, but things may get a little complicated there. The privacy update earlier this year didn’t do much for the user in this regard: it was a well-presented screen/facade over the same behind-the-scenes workings, enticing users to ‘recommended’ settings and silencing the critics. Prof. Ross Anderson sums it up very aptly in an OWASP podcast: Facebook is trying something extremely difficult here, and it is going to face many challenges on the road ahead.

For the readers interested further in Facebook Privacy Settings, Sophos has a good basic guide to the settings here.

Tuesday, 17 August 2010

Security training - make the message stick!

Security training programmes are boring. I don’t have the statistics to support this claim, but surely there are things employees have to do, or would rather do, than attend a one-hour session instructing them on security policies and best practices! Think of a security professional sitting in an accounting training class. It is done because it has to be done, with varied levels of interest, mostly low.

And the aim of a training programme, indeed of any presentation, is to engage the audience and deliver a message that sticks, preferably for a long time. Going through slide after slide of bullet points is just another presentation the audience will never remember.





I recently delivered a PCI training session to a few DBAs. The slide deck was ready to take them through the whole nine yards of PCI DSS, ensuring they understood what it meant and that it has 12 requirements they should be aware of. Having sat in their chairs before, I thought there might be a better way of delivering it to make the training more interesting. An impromptu decision: a real story is always a great start.

So I decided to tell them the story of one particular individual named Albert Gonzalez. He could have been classed as one big APT, looking at the number of ‘results’ he had under his belt! :) His was more of a joint effort, but the story of the several breaches he led sticks, and it sticks well. The events are also directly or indirectly responsible for the onus on QSAs to store evidence while performing an assessment. I’m sure many QSAs don’t like Mr Gonzalez for that, but I must thank him for providing such an entertaining story for my audience.

I didn’t have this transcript during the training, but the conversation excerpt is a good insight into one of the many threats the standard aims to protect against. I must mention that Gonzalez bought a 0-day, but let’s not go there.

This is also the key approach in security: realise the threats.

Once a good DBA knows what the threat is and what is being protected, they usually understand the rationale behind the rigour of the controls and their own responsibilities, and can watch out for suspicious events better.

Chances are, it may help them avoid installing the Facebook Dislike Button!

Friday, 13 August 2010

Blame the game #QSAs

QSAs get a lot of criticism from the security and business communities. Following a recent discussion on QSA quality and how it can be improved in order to improve the client experience, I decided to look at the existing QSA qualification requirements and the course structure on the PCI SSC site. This may help me understand whether the criticism is justified and what can be done to address it.


Cost

NEW QSA Training Fee - $1,250 USD
Annual Requalification Fee - $995 USD


Experience/Background

CISSP, CISA or CISM Certificate, or

5 years of IT Security experience, submitted in résumé format


Training


The class test is closed book; the only document you are allowed to reference during the test is a translation dictionary, if needed.

Day 1

Module 1 - PCI DSS Program Overview
• PCI Security Standards Council
• Roles & Responsibilities
• Payment Industry Terminology
• Payment Transaction Flow
• Service Provider Relationships
• Payment Brand Compliance Programs
• SAQ Overview
• PA-DSS Applicability

Module 2 - PCI DSS Assessment Scoping
• Cardholder Data Discovery
• Cardholder Data Flow
• Cardholder Data Storage
• Network Segmentation
• Scoping the Cardholder Data Environment

Day 2

Module 3 - PCI DSS Requirements
• PCI DSS v1.2 Overview
• PCI DSS v1.2 Requirements
• PCI DSS Assessment Preparation
• Report on Compliance Documentation
• Prioritized Approach for PCI DSS 1.2

Day 3

Module 4 - Compensating Controls
• Compensating Controls Definitions
• Compensating Controls Worksheet
• Compensating Controls Examples

Team Case Studies

Exam

Module 5 - PCI DSS Compliance Program Development
• Ten Common Myths of PCI DSS
• PCI Compliance Process
• PCI Compliance Recommendations
• Information Security Management System Implementation (ISO 27001)

I took the QSA training around mid-2008 and, having sat the CISSP test before, I thought the QSA test was relatively light. I’m sure many others share my view, and this is partly reflected in the changes introduced in Jan 2010.


The test structure has since changed to a closed-book CBT test, now with a Continuing Education requirement (120 hours over 3 years, with a minimum of 20 hours each year).


From the point of view of a QSAC (the ‘C’ompany that employs QSAs), they’d want as many qualified consultants as possible to be QSAs to support the clients, and the QSA training costs can be recouped in a few days of QSA consultancy. To add to this, the pre-requisites to qualify are not very stringent either: one needs to show 5 years of IT Security experience. I’m not sure whether this experience is validated directly by the SSC; I believe it remains the QSAC’s responsibility to do so.


Understanding the businesses, the transaction process and the various relationships is not easy. It is not something that an individual with 5 years of IT Security experience, a 3-day training course and a test can readily address. Nor can it be consistent, as the standard is open to interpretation. Acquirers have been proactive and have published guidance around call recording of SAD (Sensitive Authentication Data) post-authorisation to avoid misinterpretation. Every acquirer has their own PCI team with their own interpretations and requirements, and it becomes difficult for the QSAs to ‘sing from the same hymn sheet’.


To improve this, tiers/levels could be assigned to QSAs based on the assessments they do. I’m sure every QSA will agree that it is a learning process; no QSA knows 100% of everything when he/she comes out of the training. If a freshly qualified QSA is assigned to a relatively difficult project without any support from a 'gold' or 'silver' QSA, that is down to the QSAC, and the client's experience suffers. Since it is an assessment/audit-related role with a technical emphasis, it may also help if certifications like CISSP, CISA or CISM were made mandatory. Most QSAs already hold one of these certifications, so it shouldn't be too onerous.


On the bright side, I have seen the QSA market mature over the last year, and QSACs are asking for experienced QSAs with several RoCs under their belt. This is a good thing. Organisations are also maturing in how they address PCI compliance, engaging a QSAC early to provide experienced QSAs who can lead them towards a successful assessment. All this comes at a cost, a cost that some organisations may not have foreseen in their business plans. That’s when the fun begins. Dave did a great post on selecting your QSAs here.

Tuesday, 10 August 2010

Show me the data

I was watching the National Geographic channel the other day, learning how SEPTA, Philadelphia's transit network, is managed and maintained. The trains use an overhead power source that requires regular maintenance. On top of this, incidents occur that the authorities must respond to. Each out-of-service carriage costs the network, and appropriate procedures are followed to ensure the safety of the passengers.

I enjoyed the episode and, being a security professional, could quickly relate it to the similar processes and controls in IT Security. The major difference is that the impact, or the experience, is not as visual. Workers bringing down a power line to change a section of electrical cable and impacting the line during BAU is a more tangible experience for the viewer than a 0-day patch being applied to a web server that ends up affecting the business. The former is easier to explain too, and the risks are more tangible, i.e. carriage damage, financial loss due to delays or line failure or, in the worst case, loss of lives. Talking about the risks of a compromise/malware outbreak and eventual data loss due to CVE-2010-1423 or CVE-2009-4324 being used in the Eleonore exploit kit will probably not have the same impact.



Visibility is the key here - of the assets and the risks. Most organisations clearly struggle to identify and classify their data assets as the business grows, and the related risks are blurred or suppressed by business objectives. The patching process is a clear victim here: it usually takes a back seat in order to support the business. Obviously, offence is easier, more exciting and more rewarding (to the malware authors) than defence.

PCI DSS has succeeded to an extent in identifying the asset and listing requirements to protect it. I'm not sure patching is as simple as requirements 6.1 and 6.2 lead us to believe; I have seen businesses struggle in this area and slip past the nets of QSAs to remain exposed.

Bits are not as exciting; if only I could colour my data red and green and see it going across the network! Cut the cable and see the red drip. I would like to see more innovation happen at the application level, where the data is created. Why doesn't MS Word or Excel ask me to mark/classify my data? It could be as simple as red and green. Surely this model has to change; maybe the customer is not asking for such functionality, or isn't asking loudly enough. There is DLP and then there is Web DLP. Organisations try to cap data leaks from ports on their devices but miss the big port, i.e. the Internet.
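To sketch the ‘red and green’ idea (purely illustrative - the type and the chokepoint below are invented for this post, not any product’s API):

    # Toy 'colour my data' sketch: classification travels with the data and a
    # chokepoint refuses to let sensitive ('red') data leave in the clear.
    from dataclasses import dataclass

    RED, GREEN = "red", "green"  # red = sensitive, e.g. cardholder data

    @dataclass
    class Labelled:
        value: str
        label: str = GREEN

    def send_over_network(item: Labelled) -> None:
        """A DLP-style gate: 'red' data must be encrypted/tokenised first."""
        if item.label == RED:
            raise PermissionError("red data may not leave unprotected")
        print(f"sent: {item.value}")

    send_over_network(Labelled("store opening hours"))       # green - allowed
    # send_over_network(Labelled("4111111111111111", RED))   # would raise

The interesting part is not the check itself but getting the label attached at creation time, which is exactly where Word or Excel could ask.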

Wednesday, 28 July 2010

Observations from VZDBIR 2010

[Image: interesting graphs collated from the VZDBIR]
I spent some time today reading the VZDBIR (Verizon Data Breach Investigations Report), as many of us did, and decided to collate a few of the interesting graphs from it. Reports like this are a great way to understand actual threats and to put some weight behind the argument to the business about the eventual loss they cause (in terms of records compromised). When collated, the findings provide an interesting view into these threats (whether advanced and/or persistent) and into what we are dealing with in the security community.


Questions that came to mind following the review:
- How can the compliance world take such findings on board and improve the standards?
- How can regulations/requirements improve to put appropriate weight on critical areas of security instead of an across-the-board 'old school' playing field?
- Is what is required for compliance enough to protect the business against these threats? (On second thought, this question isn't worth answering ;))


Certainly, the report shows an ever-changing threat landscape. I believe compliance can gain some weight in the security and business world by understanding such reports and incorporating their findings into standards and requirements.


It is not just in compliance (although that is one of the biggest headaches in the business) but also in security management that these observations can help teams redesign and re-evaluate their security strategies and invest wisely to address the 'real' threats and improve their ROI.


Thoughts?

Monday, 5 July 2010

Testing and QA Challenges

How much of Testing and QA is enough?

This is a question many executives, senior managers, IT managers et al. would be asking themselves.


With the recent failures at Apple and Toyota in this phase of development and/or production, the executives must surely have revisited this and discussed where things could be improved to avoid such mistakes going forward.

Running a successful business is a challenging task. Management always works towards striking the right balance: delivering a product that satisfies customer demand, is safe and secure, and delivers what is promised. In doing so, the team must maintain the right profit margins to ensure the successful running and growth of the business. With increased competition in the market and reduced product shelf life, businesses face another challenge - Time to Market (TTM).



I was involved with a public sector organisation that faced a vaguely related challenge. This organisation managed a big website that went through six-week release cycles, with new ‘products’ introduced to keep it competitive. With these frequent changes came struggles to ensure the safety and security of visitors while delivering the product(s) as promised. Obviously, with short release cycles came shorter Testing and QA windows. Eventually, in such an environment, Testing and QA tends to gravitate towards being a checklist rather than achieving what it is really intended to achieve.

Of course, with shorter TTM and short Testing/QA durations come cost constraints to keep the business viable. Surely, this is the important question - how much of it is enough? With no simple answers, I believe (and you can add to this) the bare minimum an organisation can do is:

  • Effective communications between teams

  • Plan well and revise those plans as necessary

  • Use skilled staff for Testing and QA (don’t get diverted by an incident!)

  • Retest after remediation (this is often missed!)

Certainly, hardware should undergo more rigorous and prolonged testing cycles than software products. It is always smart to blame it on software! All one needs is a much-awaited 'update'. No product recalls! I bet the next-gen Toyota will come with a network port :) Until then, Apple will try and fiddle with the on-screen graphics and ‘set a higher bar’ ;)

Tuesday, 29 June 2010

Log Management/SIEM 101

I was asked this question by senior management recently - ‘How do we tackle the logging and audit problem?’ Certainly it’s a broad question with no straight answers. The context of the question was to address compliance and incident response strategy.

For organisations facing this question, it is a big challenge to begin with, primarily because, until recently, there was little rigour around this security control to draw experience from. In this post I aim to cover the pre-requisites for starting the project.



The pre-requisites:

1. Gather requirements

  • If a project team starts looking at solutions or technology before establishing business requirements, the project is doomed to fail.

  • Understand the drivers for log management, whether compliance, operations, incident response, investigations, et al, and their specifics. Also gather requirements from the various business functions to gain their support/buy-in.

2. Identify assets (the ‘what to log’ and the ‘why to log’)

  • Using the existing asset management programme, identify what you want to log. This will also establish whether the business understands the infrastructure serving the key business processes. This is the challenging part; more often than not there are only a handful of people in the business who understand and can identify these systems. In the worst cases, there isn’t anyone!

  • An option to help address this is to look at the latest Business Continuity Plan and identify the systems that are business critical and whose failure would result in the most damage.

  • Prepare a draft grouping of these systems for a phased rollout, as sketched below.

3. Create retention and disposal policies

  • Once the key systems have been identified for the first phase of the project, discuss data retention and disposal requirements within the business, from both business and compliance perspectives.
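As a minimal sketch of what the output of steps 2 and 3 might look like (hostnames, sources and retention periods are placeholders, not recommendations):

    # Systems grouped for a phased rollout, each with a retention/disposal
    # decision attached - the working output of steps 2 and 3 above.
    PHASES = {
        1: [  # business-critical, identified from the Business Continuity Plan
            {"host": "payments-db01", "sources": ["db audit", "os auth"],
             "retain_days": 365, "dispose": "secure wipe"},
            {"host": "edge-fw01", "sources": ["firewall"],
             "retain_days": 365, "dispose": "secure wipe"},
        ],
        2: [  # supporting systems, picked up once phase 1 has bedded in
            {"host": "intranet-web01", "sources": ["web access"],
             "retain_days": 90, "dispose": "standard deletion"},
        ],
    }

    for phase, systems in sorted(PHASES.items()):
        for s in systems:
            print(f"phase {phase}: {s['host']} -> retain {s['retain_days']} days")

Even a simple table like this forces the business to make the retention and disposal decisions explicit per system rather than per project.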
In later posts, I’ll try and cover how to prepare for a phased journey into log management and how to work towards achieving benefits from the project.

Wednesday, 26 May 2010

PCI DSS QSA, ISA and Iron Man 2

Once upon a time, a PCI SSC member sat in a cinema watching the Iron Man sequel, Iron Man 2. Quite impressed by the character Ivan Vanko (played by Mickey Rourke), an idea struck: create a vaguely similar character to the QSA and name it ISA. An internal employee goes through the PCI SSC ISA training and interprets the standard to create something called the ‘ISA jacket’. At the same time, somewhere in the world or locally, a QSA Company is busy creating a QSA who is working on the ‘QSA jacket’. When being assessed, the internal employee wears the 'ISA jacket' to protect the assessed environment when the QSA comes donning the ‘QSA jacket’. The ISA jacket is also useful when doing self-assessments. There may be sparks when the two jackets meet, but the assessed organisation is intended to benefit. However, this ISA jacket doesn’t come cheap, so organisations beware!





New questions will be raised, arguments discussed and hopefully clarified and agreed, with the intent to improve the security of organisations. This is exactly what both jackets were created to achieve: stop the bad guys! Having thought about it again, the ISA seems more like James Rhodes’ character, played by Don Cheadle. Ivan Vanko (use a Russian accent from here) is more like the Albert Gonzalezes of the real world. :)

More ISA info here: https://www.pcisecuritystandards.org/education/isa_training.shtml

Wednesday, 19 May 2010

Change control, security and PCI DSS


A recent change control question from a colleague, observations over the past year with two clients, and PCI DSS-related blog coverage prompt me to write this post.


The change control processes followed by the two clients, although both implemented and in place, could not have been more different. One had a mature process: a Change Management framework with Change Analysts reviewing the requests in the queue, the awareness necessary to allocate each request to the right business function for assessment/approval, and a relatively mature tool that aided the process. Regular reviews of changes by key stakeholders were also in place to discuss the requests. The other client had an in-house tool developed to log and track requests, but the framework for managing them was very weak, with the Change Management team not fully understanding the business and not aware of whom to assign a change to or when to close a request.


PCI DSS covers change control related requirements primarily in 6.4, but it seems change control is not given the emphasis it needs in the security community, probably because it is seen more as a service management function. I believe in the Security Programme Life Cycle diagram below, and I have come across discussions around this in blog posts quite frequently. Business environments change for various reasons, and a good change control framework is important not only for the agility and adaptiveness of the business but also for security management to maintain the security posture of the environment, and in some cases improve it too!




[Diagram: Security Programme Life Cycle]
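As a toy illustration of the gap between the two clients above (the field names are invented; the point is that a request should carry enough context to be routed to the right business function, and should not close without a security impact assessment and an approval):

    from dataclasses import dataclass, field

    @dataclass
    class ChangeRequest:
        summary: str
        requested_by: str
        assess_by: str                      # business function that must review it
        security_impact: str = "unassessed"
        approvals: list = field(default_factory=list)

        def ready_to_close(self) -> bool:
            # The weaker client closed requests without either of these.
            return self.security_impact != "unassessed" and bool(self.approvals)

    cr = ChangeRequest("open port 8443 on edge firewall", "web team", "network security")
    cr.security_impact = "new internet-facing service; review and test required"
    cr.approvals.append("CAB 2010-05-12")
    print(cr.ready_to_close())  # True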


Monday, 17 May 2010

SIEM Musings - Part 1

The intent of this post is to provide the reader with a 'then and now' view of the SIEM market, using Gartner MQs from the past few years. As the first post of this blog, I also cover how logging and SIEM in general have interested me over the years.

History

It started as an internal project to create a log collection server with one of my previous employers. I was tasked with a challenge: create an easy-to-deploy, open source and, of course, secure logging solution that essentially acted as a syslog server, within six months. Using a Debian LiveCD (Knoppix), SELinux, VMware and guidance from @craigbalding, the race against time began to create an extremely light, hardened and secure image that would be used as a cheap and easy syslog server by whoever wanted to deploy one in their business. Obviously, questions came up around what to log, how to log, when to log, why to log, etc. Some areas were covered, some were missed and, after many weeks of hard work, the project was delivered. Although it was just a small step in the right direction for the organisation, the biggest benefit for me was the first-hand experience of developing an alpha 'log management' tool and of the challenges around acceptance of such a solution in businesses of various shapes and sizes.
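For flavour, the core of such a syslog server is small. Here is a bare-bones UDP listener in Python - a much-reduced sketch of what that image did, listening on port 5140 to avoid needing root (standard syslog uses UDP 514):

    import socketserver

    class SyslogHandler(socketserver.BaseRequestHandler):
        def handle(self):
            # For UDP servers, self.request is a (data, socket) pair.
            data = self.request[0].decode("utf-8", errors="replace").strip()
            with open("collected.log", "a") as f:
                f.write(f"{self.client_address[0]} {data}\n")

    if __name__ == "__main__":
        with socketserver.UDPServer(("0.0.0.0", 5140), SyslogHandler) as server:
            server.serve_forever()

The hard part, as the project taught me, was everything around it: hardening, deciding what to log and getting the business to accept the solution.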

Following this, a project followed to select a mature product offering SIEM capabilities. This was in 2006, and I got my hands on what I believe was the first Gartner SIEM MQ, for 1H06.

Since then, I have been studying these reports closely, looking at developments in the products, deploying some of them and working with businesses in various industry sectors to reap the benefits of their logging solution of choice.

SIEM MQs

So, below are the MQs from 2006, 2008, 2009 and, most recently, 2010.




[Image: Gartner SIEM MQ, 1H06]

[Image: Gartner SIEM MQ, 2008]

[Image: Gartner SIEM MQ, 2009]

[Image: Gartner SIEM MQ, 2010]

And below is a quick tabular assessment of the above MQs. This may give an idea of how the vendors have fared over the years in the eyes of Gartner analysts and their definition of SIEM. The vendors in green font have appeared in all four MQs, though not necessarily in the same quadrant. Pardon the colour selection :).


[Table: vendor positions across the four MQs]

Hopefully, the above provides businesses in the process of SIEM vendor selection with a quick snapshot of where vendors are positioned in the MQ and of their journeys within it.

Personally, I take the MQ as a good starting point for vendor selection/assessment, but it certainly shouldn't be the end or the focus of it. As with most product selection, this is a multi-objective optimisation problem: businesses have various constraints to select against, like cost, compliance, operations, incident response, audit, etc.

In the next post, I would like to focus on vendor movements in the MQ and whether these movements make sense. This may even lead to the science behind the MQ, who knows! ;) What I am really interested to know is whether these movements are actually experienced by the organisations who use these products, e.g. someone using Q1Labs since 2006.

Thank you @rockyd for pointing me to the 2010 MQ.