

NETWORK IPS COMPARATIVE ANALYSIS

Security

2013 – Thomas Skybakmoen, Jason Pappalexis

Tested Products
Check Point 13500, Dell SonicWALL SuperMassive E10800, Fortinet FortiGate-3600C, HP TippingPoint S7500NX, IBM GX7800, Juniper SRX5800, McAfee NS-9100, McAfee NS-9200, Sourcefire 7120, Stonesoft 3206


NSS Labs – Intrusion Prevention System Comparative Analysis – Security

© 2013 NSS Labs, Inc. All rights reserved.

Overview

Implementation of an intrusion prevention system (IPS) can be a complex process with multiple factors affecting the overall security effectiveness of the solution. These should be considered over the course of the useful life of the solution, and include:

1. Exploit block rate
2. Anti-evasion capabilities (resistance to common evasion techniques)
3. Device stability and reliability
4. Overall manageability (see "Management Comparative Analysis Report (CAR)")

In  order  to  determine  the  relative  security  effectiveness  of  devices  on  the  market  and  facilitate  accurate  product  comparisons,  NSS  Labs  has  developed  a  unique  metric:  

Security Effectiveness = Exploit Block Rate¹ × Anti-Evasion Rating × Stability & Reliability

Figure  1  –  Security  Effectiveness  Formula  

By  focusing  on  overall  security  effectiveness  instead  of  the  exploit  block  rate  alone,  NSS  is  able  to  factor  in  the  ease  with  which  defenses  can  be  bypassed,  as  well  as  the  reliability  of  the  device.    
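As a minimal sketch of the formula in Figure 1 (ratings expressed as fractions in [0, 1]; note the published security effectiveness values are computed from unrounded block rates, so they differ slightly from the rounded percentages shown in Figure 2):

```python
# Sketch of the NSS Security Effectiveness formula (Figure 1).
# All three inputs are fractions in [0, 1]; the function name is
# illustrative, not part of any NSS tooling.
def security_effectiveness(exploit_block_rate: float,
                           anti_evasion_rating: float,
                           stability_reliability: float) -> float:
    """Combined score: a weak result in any one component drags
    down the whole, since the factors are multiplied."""
    return exploit_block_rate * anti_evasion_rating * stability_reliability

# Example: a device blocking 94% of exploits, with perfect anti-evasion
# and stability ratings, scores 94% overall.
score = security_effectiveness(0.94, 1.00, 1.00)
print(f"{score:.1%}")  # → 94.0%
```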

Product | Exploit Block Rate | Anti-Evasion Rating | Stability & Reliability | Security Effectiveness
Check Point 13500 | 94% | 100% | 100% | 94.4%
Dell SonicWALL SuperMassive E10800 | 95% | 100% | 100% | 94.8%
Fortinet FortiGate-3600C | 94% | 100% | 100% | 93.8%
HP TippingPoint S7500NX | 91% | 100% | 100% | 91.1%
IBM GX7800 | 96% | 100% | 100% | 95.7%
Juniper SRX5800 | 89% | 100% | 100% | 89.2%
McAfee NS-9100 | 95% | 100% | 100% | 95.1%
McAfee NS-9200 | 95% | 100% | 100% | 95.1%
Sourcefire 7120 | 98% | 100% | 100% | 97.9%
Stonesoft 3206 | 95% | 100% | 100% | 94.7%

Figure  2  –  Security  Effectiveness  

Because enterprise users consider effective management to be a critical component of any enterprise security deployment, management should also be factored into total cost of ownership (TCO) and overall product selection. It is, however, outside the scope of this report. For more information, refer to the TCO and Management CARs. For a complete view of security effectiveness mapped against value, refer to the Security Value Map (SVM) CAR.

NSS research indicates that the majority of enterprises tune their IPS. Therefore, for NSS' testing of IPS products, the devices are deployed with a tuned policy. Every effort is made to deploy policies that ensure the optimal combination of security effectiveness and performance, as would be the aim of a typical customer deploying the device in a live network environment. This provides readers with the most useful information on key IPS security effectiveness and performance capabilities based upon their expected usage.

¹ Exploit Block Rate is defined as the percentage of exploits blocked under test.

Evasion  techniques  are  a  means  of  disguising  and  modifying  attacks  in  order  to  avoid  detection  and  blocking  by  security  products.  Resistance  to  evasion  is  a  critical  component  in  an  IPS.  If  a  single  evasion  is  missed,  an  attacker  can  utilize  an  entire  class  of  exploits  to  circumvent  the  IPS,  rendering  it  virtually  useless.  Many  of  the  techniques  used  in  this  test  have  been  widely  known  for  years  and  should  be  considered  minimum  requirements  for  the  IPS  product  category,  while  others  are  more  recent.  This  particular  category  of  tests  is  critical  in  the  final  weighting  with  regard  to  product  guidance.  

This  chart  depicts  the  relationship  between  protection  and  performance  when  tuned  policies  are  used.  Farther  up  indicates  better  security  effectiveness,  and  farther  to  the  right  indicates  higher  throughput.  

 

Figure  3  –  Security  Effectiveness  And  Performance  

When  selecting  products,  those  along  the  top  line  of  the  chart  (closer  to  100%  security  effectiveness)  should  be  prioritized.  The  throughput  is  a  secondary  consideration  and  will  be  dependent  on  enterprise-­‐specific  deployment  requirements.  

[Figure 3 chart: security effectiveness (80%–100%) plotted against NSS-Tested Throughput (0–35,000 Mbps) for each of the ten tested products.]
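The selection guidance above (security effectiveness first, throughput second) can be sketched as a simple filter-and-sort. The throughput figures below are placeholder values for illustration only, not NSS test results; the effectiveness values are from Figure 2:

```python
# Illustrative product shortlisting: effectiveness is the primary
# criterion, throughput a secondary, deployment-specific one.
products = [
    # (name, security_effectiveness, throughput_mbps)  # Mbps values are
    ("Sourcefire 7120", 0.979, 4000),                  # placeholders,
    ("IBM GX7800", 0.957, 6000),                       # not NSS results
    ("Juniper SRX5800", 0.892, 30000),
]

def rank(items, min_effectiveness=0.95):
    """Shortlist products meeting an effectiveness floor, then sort
    the shortlist by throughput to suit deployment requirements."""
    shortlist = [p for p in items if p[1] >= min_effectiveness]
    return sorted(shortlist, key=lambda p: p[2], reverse=True)

for name, eff, mbps in rank(products):
    print(f"{name}: {eff:.1%} @ {mbps} Mbps")
```

Note that a high-throughput device below the effectiveness floor (here, the Juniper entry) is excluded entirely rather than traded off, mirroring the report's advice to prioritize the top line of the chart.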


Table of Contents

Overview ... 2
Analysis ... 5
    Tuning ... 5
    Exploit Block Rate ... 5
        Exploit Block Rate By Year ... 5
        Exploit Block Rate By Attack Vector ... 6
        Exploit Block Rate By Impact Type ... 8
        Exploit Block Rate By Target Vendor ... 9
    Evasions ... 9
    Stability & Reliability ... 13
    Security Effectiveness ... 14
    Managed Security Effectiveness ... 15
Contact Information ... 16

Table of Figures

Figure 1 – Security Effectiveness Formula ... 2
Figure 2 – Security Effectiveness ... 2
Figure 3 – Security Effectiveness And Performance ... 3
Figure 4 – Exploit Block Rate by Year – Default Policies ... 6
Figure 5 – Attacker-Initiated Exploit Block Rate ... 7
Figure 6 – Target-Initiated Exploit Block Rate ... 7
Figure 7 – Overall Exploit Block Rate ... 7
Figure 8 – Exploit Block Rate By Impact Type ... 8
Figure 9 – Exploit Block Rate by Target Vendor ... 9
Figure 10 – Exploits And Evasions (Server-Side) ... 10
Figure 11 – Exploits And Evasions (Client-Side) ... 10
Figure 12 – Exploits And Evasions (Combined) ... 10
Figure 13 – Evasion Resistance (I) ... 11
Figure 14 – Evasion Resistance (II) ... 11
Figure 15 – Overall Evasion Results ... 12
Figure 16 – Stability And Reliability (I) ... 13
Figure 17 – Stability And Reliability (II) ... 13
Figure 18 – Security Effectiveness ... 14


Analysis

The threat landscape is evolving constantly; attackers are refining their strategies and increasing both the volume and intelligence of their attacks. Enterprises now must defend against targeted persistent attacks (TPA). In the past, servers were the main target, but attacks against desktop client applications are now mainstream and present a clear danger to organizations.

Tuning  

Security products are often complex, and vendors are responding by simplifying the user interface and security policy selection to meet the usability needs of a broadening user base. Indeed, many organizations accept and deploy the default settings, understanding these to be the best recommendations from the vendor. IPS products, however, are the exception to this rule. NSS' research indicates that most, if not all, enterprises tune their IPS. In general, accepting a vendor's defaults is likely to result in the omission of a significant number of deployment-specific signatures, which could leave an organization at risk.

With  the  shortage  of  skilled  and  experienced  practitioners,  it  is  important  to  consider  the  time  and  resources  required  to  properly  install,  maintain,  and  tune  the  solution.  Failure  to  do  so  could  result  in  products  not  achieving  their  full  security  potential.  Therefore,  all  IPS  products  are  tuned  prior  to  testing  to  eliminate  false  positives  and  provide  the  most  appropriate  coverage  for  the  systems  to  be  protected.  Typically,  tuning  is  carried  out  by  experienced  system  engineers  from  the  vendor  company,  but  where  this  is  not  possible,  NSS  engineers  will  perform  the  necessary  tuning.  NSS  engineers  may  also  amend  the  configuration  of  a  device  under  test  (DUT),  where  specific  characteristics  of  the  DUT  or  its  configuration  interfere  with  the  normal  operation  of  any  of  the  tests,  or  where  the  results  obtained  from  those  tests  would,  in  the  opinion  of  those  engineers,  misrepresent  the  true  capabilities  of  the  DUT.  Every  effort  is  made  to  ensure  the  optimal  combination  of  security  effectiveness  and  performance,  as  would  be  the  aim  of  a  typical  customer  deploying  the  DUT  in  a  live  network  environment.  

Tuning  an  IPS  is  a  potentially  complicated  endeavor  that  must  be  performed  uniquely  for  each  environment.  Many  factors  affect  which  signatures  or  rules  should  be  enabled  on  an  IPS,  for  example,  network  architecture,  target  assets,  patch  levels,  allowed  protocols,  etc.  

Exploit Block Rate

NSS' security effectiveness testing leverages the deep expertise of our engineers to generate the same types of attacks used by modern cyber criminals, utilizing multiple commercial, open source, and proprietary tools as appropriate. With over 1,800 live exploits, this is the industry's most comprehensive test to date. Most notably, all of the live exploits and payloads in these tests have been validated such that one of the following occurs:

• A reverse shell is returned
• A bind shell is opened on the target, allowing the attacker to execute arbitrary commands
• A malicious payload is installed
• The system is rendered unresponsive
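The validation criteria above can be sketched as a small classifier. The outcome names below paraphrase the bullet list; the idea of a harness producing these outcomes is a hypothetical illustration, not NSS' actual tooling:

```python
# Hypothetical sketch of exploit validation: an exploit counts as
# "live" only if one of the four observable outcomes from the report's
# bullet list occurred on the target.
from enum import Enum

class Outcome(Enum):
    REVERSE_SHELL = "reverse shell returned"
    BIND_SHELL = "bind shell opened on target"
    PAYLOAD_INSTALLED = "malicious payload installed"
    TARGET_UNRESPONSIVE = "system rendered unresponsive"

def is_validated(result) -> bool:
    """True only for one of the four recognized outcomes; anything
    else (e.g. an exception string) is excluded from the test set."""
    return isinstance(result, Outcome)

print(is_validated(Outcome.BIND_SHELL))   # → True
print(is_validated("exception raised"))   # → False
```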

Exploit  Block  Rate  By  Year  

Contrary  to  popular  belief,  the  biggest  risks  are  not  always  driven  by  the  latest  “Patch  Tuesday”  disclosures.  NSS’  threat  research  reveals  that  many  older  attacks  are  still  in  circulation,  and  therefore  remain  relevant.  


Different  vendors  take  different  approaches  to  adding  coverage  once  a  vulnerability  is  disclosed.  An  attempt  to  provide  rapid  coverage  for  vulnerabilities  that  are  not  fully  understood  can  result  in  multiple  exploit-­‐specific  signatures  that  may  be  inaccurate,  ineffective,  or  prone  to  false  positives.  Vendors  that  have  the  resources  available  to  fully  research  a  vulnerability  will  hopefully  produce  vulnerability-­‐oriented  signatures  that  provide  coverage  for  all  exploits  written  to  take  advantage  of  that  flaw.  This  approach  provides  more  effective  coverage  with  fewer  false  positives.  

Where a product has performance limitations, vendors may retire older signatures in an attempt to alleviate those limitations, resulting in inconsistent coverage for older vulnerabilities and varying levels of protection across products. The following table classifies coverage by disclosure date, as tracked by CVE numbers. The table is sorted by total protection, and the green sections of the heat map indicate vendors with higher coverage for the given year (columns).

Product | <=2004 | 2005 | 2006 | 2007 | 2008 | 2009 | 2010 | 2011 | 2012 | 2013 | Total
Check Point 13500 | 100.0% | 96.9% | 98.9% | 99.2% | 96.9% | 97.3% | 93.4% | 75.2% | 90.2% | 68.0% | 94.4%
Dell SonicWALL SuperMassive E10800 | 100.0% | 100.0% | 100.0% | 99.6% | 97.8% | 98.4% | 93.1% | 71.8% | 90.2% | 64.0% | 94.8%
Fortinet FortiGate-3600C | 100.0% | 97.9% | 98.9% | 97.7% | 97.5% | 96.8% | 94.0% | 70.9% | 87.8% | 60.0% | 93.8%
HP TippingPoint S7500NX | 93.3% | 95.8% | 97.9% | 96.2% | 93.4% | 93.5% | 88.2% | 70.9% | 87.8% | 60.0% | 91.1%
IBM GX7800 | 100.0% | 97.4% | 100.0% | 98.9% | 97.8% | 97.8% | 94.6% | 82.1% | 90.7% | 92.0% | 95.7%
Juniper SRX5800 | 100.0% | 90.6% | 93.2% | 91.6% | 92.8% | 94.6% | 89.1% | 70.9% | 82.9% | 68.0% | 89.2%
McAfee NS-9100 | 100.0% | 99.5% | 99.5% | 99.6% | 97.8% | 97.8% | 93.4% | 75.2% | 91.7% | 68.0% | 95.1%
McAfee NS-9200 | 100.0% | 99.5% | 99.5% | 99.6% | 97.8% | 97.8% | 93.4% | 75.2% | 91.7% | 68.0% | 95.1%
Sourcefire 7120 | 100.0% | 99.5% | 100.0% | 99.2% | 99.4% | 99.5% | 98.2% | 88.9% | 93.2% | 100.0% | 97.9%
Stonesoft 3206 | 80.0% | 94.3% | 96.3% | 96.2% | 93.8% | 94.6% | 97.9% | 80.3% | 97.1% | 96.0% | 94.7%

Figure  4  –  Exploit  Block  Rate  by  Year  –  Default  Policies  
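As an illustration of reading Figure 4, a short sketch (reproducing only a subset of rows and years; percentages as published) finds each product's weakest coverage year:

```python
# Sketch: from a subset of the Figure 4 data, find each product's
# weakest coverage year. Values are block-rate percentages as
# published in the report.
coverage = {
    "Sourcefire 7120": {"2011": 88.9, "2012": 93.2, "2013": 100.0},
    "Juniper SRX5800": {"2011": 70.9, "2012": 82.9, "2013": 68.0},
    "Stonesoft 3206":  {"2011": 80.3, "2012": 97.1, "2013": 96.0},
}

def weakest_year(by_year: dict) -> tuple:
    """Return the (year, percent) pair with the lowest block rate."""
    return min(by_year.items(), key=lambda kv: kv[1])

for product, by_year in coverage.items():
    year, pct = weakest_year(by_year)
    print(f"{product}: weakest coverage in {year} ({pct}%)")
```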

Exploit  Block  Rate  By  Attack  Vector  

Exploits  can  be  initiated  either  locally  by  the  target  (desktop  client)  or  remotely  by  the  attacker  against  a  server.  Since  2007,  NSS  researchers  have  observed  a  dramatic  rise  in  the  number  of  client-­‐side  exploits,  because  these  can  be  easily  launched  by  an  unsuspecting  user  who  visits  an  infected  website.  Historically,  IPS  products  did  not  focus  on  these  types  of  attacks,  which  were  deemed  to  be  the  responsibility  of  antivirus  products.  

This  approach  is  no  longer  viewed  as  acceptable  and,  despite  the  difficulty  of  providing  extensive  coverage  for  client-­‐side  attacks,  the  IPS  industry  has  attempted  to  provide  more  complete  client-­‐side  coverage.    

NSS  utilizes  the  following  definitions:    

Attacker-­‐Initiated:  The  threat/exploit  is  executed  by  the  attacker  remotely  against  a  vulnerable  application  and/or  operating  system.  These  attacks  traditionally  target  servers  (which  is  why  they  are  often  referred  to  as  server-­‐side  attacks).  

Target-­‐Initiated:  The  threat/exploit  is  initiated  by  the  vulnerable  target  (which  is  why  they  are  often  referred  to  as  client-­‐side  attacks).  The  attacker  has  little  or  no  control  as  to  when  the  target  user  or  application  will  execute  the  threat.  These  attacks  traditionally  target  desktop  client  applications.  Target  examples  include  Internet  Explorer,  Adobe,  Firefox,  QuickTime,  Office  applications,  etc.  


 

Figure 5 – Attacker-Initiated Exploit Block Rate

Product | Attacker-Initiated Block Rate
Check Point 13500 | 96.1%
Dell SonicWALL SuperMassive E10800 | 97.6%
Fortinet FortiGate-3600C | 96.3%
HP TippingPoint S7500NX | 91.7%
IBM GX7800 | 97.7%
Juniper SRX5800 | 90.3%
McAfee NS-9100 | 97.5%
McAfee NS-9200 | 97.5%
Sourcefire 7120 | 99.3%
Stonesoft 3206 | 93.8%

Figure 6 – Target-Initiated Exploit Block Rate

Product | Target-Initiated Block Rate
Check Point 13500 | 92.9%
Dell SonicWALL SuperMassive E10800 | 92.5%
Fortinet FortiGate-3600C | 91.7%
HP TippingPoint S7500NX | 90.6%
IBM GX7800 | 94.1%
Juniper SRX5800 | 88.3%
McAfee NS-9100 | 93.2%
McAfee NS-9200 | 93.2%
Sourcefire 7120 | 96.7%
Stonesoft 3206 | 95.4%

Figure 7 – Overall Exploit Block Rate

Product | Overall Block Rate
Check Point 13500 | 94.4%
Dell SonicWALL SuperMassive E10800 | 94.8%
Fortinet FortiGate-3600C | 93.8%
HP TippingPoint S7500NX | 91.1%
IBM GX7800 | 95.7%
Juniper SRX5800 | 89.2%
McAfee NS-9100 | 95.1%
McAfee NS-9200 | 95.1%
Sourcefire 7120 | 97.9%
Stonesoft 3206 | 94.7%


NSS’  research  indicates  that  most  enterprises  are  forced  to  support  a  heterogeneous  mix  of  desktop  client  applications.  Further,  enterprise  IT  departments  are  often  unable  to  positively  identify  which  client  applications  are  running  on  their  employees’  desktops,  and  which  are  not.    

This research provides new clarity regarding tuning best practices and indicates that it is still necessary to tune an IPS protecting servers in a DMZ or data center. Research also indicates that when using an IPS to protect desktop client applications, enterprises are discovering that it is often best to enable a (nearly) full complement of signatures, since it is not feasible to tune an IPS based on specific desktop client applications.

Given  the  rapid  evolution  of  criminal  activity  targeting  desktop  client  applications,  enterprises  will  be  required  to  dedicate  more  resources  to  client-­‐side  protection  in  2013  and  2014.  

Exploit  Block  Rate  By  Impact  Type  

The  most  serious  exploits  are  those  that  result  in  a  remote  system  compromise,  providing  the  attacker  with  the  ability  to  execute  arbitrary  system-­‐level  commands.  Most  exploits  in  this  class  are  “weaponized”  and  offer  the  attacker  a  fully  interactive  remote  shell  on  the  target  client  or  server.    

Slightly  less  serious  are  attacks  that  result  in  an  individual  service  compromise,  but  not  arbitrary  system-­‐level  command  execution.  Typical  attacks  in  this  category  include  service-­‐specific  attacks,  such  as  SQL  injection,  that  enable  an  attacker  to  execute  arbitrary  SQL  commands  within  the  database  service.  These  attacks  are  somewhat  isolated  to  the  service  and  do  not  immediately  result  in  full  system-­‐level  access  to  the  operating  system  and  all  services.  However,  by  using  additional  localized  system  attacks,  it  may  be  possible  for  the  attacker  to  escalate  from  the  service  level  to  the  system  level.    

Finally,  there  are  the  attacks  (often  target  initiated)  which  result  in  a  system  or  service-­‐level  fault  that  crashes  the  targeted  service  or  application  and  requires  administrative  action  to  restart  the  service  or  reboot  the  system.  These  attacks  do  not  enable  the  attacker  to  execute  arbitrary  commands.  Still,  the  resulting  impact  to  the  business  could  be  severe,  as  the  attacker  could  crash  a  protected  system  or  service.    

Product | System Exposure | Service Exposure | System or Service Fault
Check Point 13500 | 93.7% | 98.3% | 97.9%
Dell SonicWALL SuperMassive E10800 | 93.9% | 100.0% | 100.0%
Fortinet FortiGate-3600C | 93.1% | 98.3% | 97.2%
HP TippingPoint S7500NX | 91.2% | 96.7% | 84.8%
IBM GX7800 | 95.3% | 99.2% | 97.2%
Juniper SRX5800 | 88.8% | 88.3% | 94.5%
McAfee NS-9100 | 94.4% | 100.0% | 98.6%
McAfee NS-9200 | 94.4% | 100.0% | 98.6%
Sourcefire 7120 | 97.6% | 100.0% | 99.3%
Stonesoft 3206 | 95.6% | 93.3% | 85.5%

Figure  8  –  Exploit  Block  Rate  By  Impact  Type  

See  individual  Product  Analysis  Reports  (PAR)  for  more  information.  


Exploit  Block  Rate  By  Target  Vendor  

The NSS exploit library covers a wide range of protocols and applications representing a broad cross-section of software vendors. The following table shows coverage for 5 of the top vendor targets (out of more than 70), as determined by the number of vendor-specific exploits in the NSS exploit library for this round of testing.

Product | Microsoft | Oracle | Apple | IBM | Adobe
Check Point 13500 | 99.5% | 98.7% | 100.0% | 100.0% | 100.0%
Dell SonicWALL SuperMassive E10800 | 100.0% | 100.0% | 100.0% | 100.0% | 100.0%
Fortinet FortiGate-3600C | 99.1% | 97.3% | 95.7% | 100.0% | 93.3%
HP TippingPoint S7500NX | 95.7% | 97.3% | 97.1% | 93.3% | 97.8%
IBM GX7800 | 99.5% | 100.0% | 98.6% | 97.8% | 100.0%
Juniper SRX5800 | 95.4% | 77.3% | 95.7% | 97.8% | 93.3%
McAfee NS-9100 | 99.1% | 100.0% | 100.0% | 100.0% | 97.8%
McAfee NS-9200 | 99.1% | 100.0% | 100.0% | 100.0% | 97.8%
Sourcefire 7120 | 100.0% | 100.0% | 97.1% | 100.0% | 100.0%
Stonesoft 3206 | 96.8% | 96.0% | 98.6% | 93.3% | 100.0%

Figure  9  —  Exploit  Block  Rate  by  Target  Vendor  

See  individual  Product  Analysis  Reports  (PAR)  for  more  information.  

Evasions  

Evasion techniques are a means of disguising and modifying attacks at the point of delivery in order to avoid detection and blocking by security products. If a security device fails to correctly handle a particular type of evasion, an attacker can potentially use an entire class of exploits for which the device is assumed to have protection, rendering the device virtually useless. Many of the techniques used in this test have been widely known for years and should be considered minimum requirements for the IPS product category.

Providing  exploit  protection  results  without  fully  factoring  in  evasion  can  be  misleading.  The  more  classes  of  evasion  that  are  missed—IP  fragmentation,  TCP  segmentation,  RPC  fragmentation,  URL  obfuscation,  TCP  split  handshake  and  FTP  evasion—the  less  effective  the  device.  For  example,  it  is  better  to  miss  all  techniques  in  one  evasion  category  (say,  FTP  evasion)  than  one  technique  in  each  category,  which  would  result  in  a  broader  attack  surface.    

Furthermore, evasions operating at the lower layers of the network stack (IP fragmentation or TCP segmentation) will have a greater impact on security effectiveness than those operating at the upper layers (HTTP or FTP obfuscation). This is because lower-level evasions can potentially affect a wider number of exploits; missing TCP segmentation is therefore a much more serious issue than missing FTP obfuscation.

A  product’s  effectiveness  is  significantly  handicapped  if  it  fails  to  detect  exploits  that  employ  obfuscation  or  evasion  techniques,  and  the  NSS  product  guidance  is  adjusted  to  reflect  this.  
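The layering argument above can be made concrete with a hedged sketch. The weights below are invented for illustration (NSS does not publish an exact weighting), but they capture why a missed low-layer evasion class is penalized more heavily than a missed narrow, protocol-specific one:

```python
# Illustrative weighting of missed evasion classes. Weights are
# invented for this sketch: lower-layer evasions affect more exploit
# classes, so they carry more weight.
EVASION_WEIGHT = {
    "ip_fragmentation": 1.0,   # lowest layer, broadest impact
    "tcp_segmentation": 1.0,
    "rpc_fragmentation": 0.6,
    "url_obfuscation": 0.3,
    "ftp_evasion": 0.2,        # narrow, protocol-specific impact
}

def evasion_penalty(missed: list) -> float:
    """Sum of weights for missed evasion classes; higher is worse."""
    return sum(EVASION_WEIGHT[m] for m in missed)

# A single missed low-layer class outweighs a single missed narrow class:
print(evasion_penalty(["tcp_segmentation"]) > evasion_penalty(["ftp_evasion"]))  # → True
```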

As with exploits, evasions can be employed specifically to obfuscate attacks that are initiated either locally by the target (client-side) or remotely by the attacker against a server (server-side). Some evasions are equally effective when used with both server-side and client-side attacks. See the section on Exploit Block Rate By Attack Vector for more information.

The  following  charts  depict  attacker-­‐  and  target-­‐initiated  exploits  and  evasions  combined.    

 

Figure 10 – Exploits And Evasions (Server-Side)

Product | Server-Side Block Rate
Check Point 13500 | 96.1%
Dell SonicWALL SuperMassive E10800 | 97.6%
Fortinet FortiGate-3600C | 96.3%
HP TippingPoint S7500NX | 91.7%
IBM GX7800 | 97.7%
Juniper SRX5800 | 90.3%
McAfee NS-9100 | 97.5%
McAfee NS-9200 | 97.5%
Sourcefire 7120 | 99.3%
Stonesoft 3206 | 93.8%

Figure 11 – Exploits And Evasions (Client-Side)

Product | Client-Side Block Rate
Check Point 13500 | 92.9%
Dell SonicWALL SuperMassive E10800 | 92.5%
Fortinet FortiGate-3600C | 91.7%
HP TippingPoint S7500NX | 90.6%
IBM GX7800 | 94.1%
Juniper SRX5800 | 88.3%
McAfee NS-9100 | 93.2%
McAfee NS-9200 | 93.2%
Sourcefire 7120 | 96.7%
Stonesoft 3206 | 95.4%

Figure 12 – Exploits And Evasions (Combined)

Product | Combined Block Rate
Check Point 13500 | 94.4%
Dell SonicWALL SuperMassive E10800 | 94.8%
Fortinet FortiGate-3600C | 93.8%
HP TippingPoint S7500NX | 91.1%
IBM GX7800 | 95.7%
Juniper SRX5800 | 89.2%
McAfee NS-9100 | 95.1%
McAfee NS-9200 | 95.1%
Sourcefire 7120 | 97.9%
Stonesoft 3206 | 94.7%


Note  that  in  the  case  of  the  Fortinet  FortiGate-­‐3600C,  the  lower  block  rates  on  client-­‐side  attacks  when  compared  to  competing  products  were  the  reason  for  the  “below-­‐average”  result  for  Security  Effectiveness  (see  the  Security  Value  Map  CAR).  However,  performance  of  this  device  was  otherwise  above  average;  therefore,  if  client-­‐side  attacks  are  not  considered  relevant  (i.e.,  in  a  pure  server  environment)  this  issue  should  be  removed  from  consideration.  

The  following  figures  provide  details  on  evasion  resistance  for  the  tested  products.  

Product | IP Packet Fragmentation | TCP Stream Segmentation | RPC Fragmentation | SMB & NetBIOS Evasions | URL Obfuscation | HTML Obfuscation
Check Point 13500 | PASS | PASS | PASS | PASS | PASS | PASS
Dell SonicWALL SuperMassive E10800 | PASS | PASS | PASS | PASS | PASS | PASS
Fortinet FortiGate-3600C | PASS | PASS | PASS | PASS | PASS | PASS
HP TippingPoint S7500NX | PASS | PASS | PASS | PASS | PASS | PASS
IBM GX7800 | PASS | PASS | PASS | PASS | PASS | PASS
Juniper SRX5800 | PASS | PASS | PASS | PASS | PASS | PASS
McAfee NS-9100 | PASS | PASS | PASS | PASS | PASS | PASS
McAfee NS-9200 | PASS | PASS | PASS | PASS | PASS | PASS
Sourcefire 7120 | PASS | PASS | PASS | PASS | PASS | PASS
Stonesoft 3206 | PASS | PASS | PASS | PASS | PASS | PASS

Figure  13  –  Evasion  Resistance  (I)  

Product | Payload Encoding | FTP Evasion | IP Frag + TCP Seg | IP Frag + MSRPC Frag | IP Frag + SMB Evasions | TCP Seg + SMB/NetBIOS Evasions | TCP Split Handshake
Check Point 13500 | PASS | PASS | PASS | PASS | PASS | PASS | PASS
Dell SonicWALL SuperMassive E10800 | PASS | PASS | PASS | PASS | PASS | PASS | PASS
Fortinet FortiGate-3600C | PASS | PASS | PASS | PASS | PASS | PASS | PASS
HP TippingPoint S7500NX | PASS | PASS | PASS | PASS | PASS | PASS | PASS
IBM GX7800 | PASS | PASS | PASS | PASS | PASS | PASS | PASS
Juniper SRX5800 | PASS | PASS | PASS | PASS | PASS | PASS | PASS
McAfee NS-9100 | PASS | PASS | PASS | PASS | PASS | PASS | PASS
McAfee NS-9200 | PASS | PASS | PASS | PASS | PASS | PASS | PASS
Sourcefire 7120 | PASS | PASS | PASS | PASS | PASS | PASS | PASS
Stonesoft 3206 | PASS | PASS | PASS | PASS | PASS | PASS | PASS

Figure  14  –  Evasion  Resistance  (II)  

 

 

 

 

 


Product   Overall  Evasion  Results  

Check  Point  13500   PASS  

Dell  SonicWALL  SuperMassive  E10800   PASS  

Fortinet  FortiGate-­‐3600C   PASS  

HP  TippingPoint  S7500NX   PASS  

IBM  GX7800   PASS  

Juniper  SRX5800   PASS  

McAfee  NS-­‐9100   PASS  

McAfee  NS-­‐9200   PASS  

Sourcefire  7120   PASS  

Stonesoft  3206   PASS  

Figure  15  –  Overall  Evasion  Results  

All devices proved effective against every evasion technique tested. Note, however, that the Check Point 13500 achieved resistance to several SMB/NetBIOS evasion techniques only through the deployment of generic signatures that are unrelated to evasion protection and would be prone to causing false positive alerts in live corporate networks. Until Check Point remediates this issue, potential purchasers should carefully consider deployment of this device in situations that require complete SMB/NetBIOS protection. Existing customers with this requirement should press Check Point for a rapid solution.


Stability & Reliability

Long-term stability is particularly important for an in-line device, where failure can produce network outages. These tests verify the stability of the device under test (DUT) along with its ability to maintain security effectiveness while under normal load and while passing malicious traffic. Products that are not able to sustain legitimate traffic (or that crash) while under hostile attack will not pass.

The DUT is required to remain operational and stable throughout these tests, and to block 100% of previously blocked traffic, raising an alert for each attack. If any prohibited traffic passes, whether because of the volume of traffic or because the device fails open for any reason, the result is a FAIL.
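As an illustration only (this is not NSS's actual test harness; the function and its parameters are hypothetical), the pass/fail criterion described above can be sketched as:

```python
def stability_result(attacks_blocked: int, attacks_total: int,
                     alerts_raised: int, failed_open: bool) -> str:
    """Apply the stability pass/fail criterion described above.

    The device must block every previously blocked attack, raise an alert
    for each one, and must never fail open; any shortfall is a FAIL.
    """
    if failed_open:
        return "FAIL"
    if attacks_blocked < attacks_total or alerts_raised < attacks_total:
        return "FAIL"
    return "PASS"


# One missed block (or missed alert) is enough to fail the whole test.
print(stability_result(100, 100, 100, failed_open=False))  # -> PASS
print(stability_result(99, 100, 99, failed_open=False))    # -> FAIL
```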

Product   Blocking Under Extended Attack   Passing Legitimate Traffic Under Extended Attack   Behavior of the State Engine Under Load   Attack Detection/Blocking – Normal Load   State Preservation – Normal Load   Pass Legitimate Traffic – Normal Load

Check Point 13500   PASS   PASS   PASS   PASS   PASS   PASS
Dell SonicWALL SuperMassive E10800   PASS   PASS   PASS   PASS   PASS   PASS
Fortinet FortiGate-3600C   PASS   PASS   PASS   PASS   PASS   PASS
HP TippingPoint S7500NX   PASS   PASS   PASS   PASS   PASS   PASS
IBM GX7800   PASS   PASS   PASS   PASS   PASS   PASS
Juniper SRX5800   PASS   PASS   PASS   PASS   PASS   PASS
McAfee NS-9100   PASS   PASS   PASS   PASS   PASS   PASS
McAfee NS-9200   PASS   PASS   PASS   PASS   PASS   PASS
Sourcefire 7120   PASS   PASS   PASS   PASS   PASS   PASS
Stonesoft 3206   PASS   PASS   PASS   PASS   PASS   PASS

Figure 16 – Stability And Reliability (I)

Product   State Preservation – Maximum Exceeded   Drop Traffic – Maximum Exceeded   Protocol Fuzzing & Mutation   Power Fail   Persistence of Data   Stability And Reliability Score

Check Point 13500   PASS   PASS   PASS   PASS   YES   PASS
Dell SonicWALL SuperMassive E10800   PASS   PASS   PASS   PASS   YES   PASS
Fortinet FortiGate-3600C   PASS   PASS   PASS   PASS   YES   PASS
HP TippingPoint S7500NX   PASS   PASS   PASS   PASS   YES   PASS
IBM GX7800   PASS   PASS   PASS   PASS   YES   PASS
Juniper SRX5800   PASS   PASS   PASS   PASS   YES   PASS
McAfee NS-9100   PASS   PASS   PASS   PASS   YES   PASS
McAfee NS-9200   PASS   PASS   PASS   PASS   YES   PASS
Sourcefire 7120   PASS   PASS   PASS   PASS   YES   PASS
Stonesoft 3206   PASS   PASS   PASS   PASS   YES   PASS

Figure 17 – Stability And Reliability (II)


Security Effectiveness

The security effectiveness of a device is determined by factoring the results of evasion testing and stability & reliability testing into the exploit block rate. Figure 18 depicts the security effectiveness of each device.

Product   Exploit  Block  Rate   Anti-­‐Evasion  Rating   Stability  &  Reliability   Security  Effectiveness  

Check  Point  13500   94%   100%   100%   94.4%  

Dell  SonicWALL  SuperMassive  E10800   95%   100%   100%   94.8%  

Fortinet  FortiGate-­‐3600C   94%   100%   100%   93.8%  

HP  TippingPoint  S7500NX   91%   100%   100%   91.1%  

IBM  GX7800   96%   100%   100%   95.7%  

Juniper  SRX5800   89%   100%   100%   89.2%  

McAfee  NS-­‐9100   95%   100%   100%   95.1%  

McAfee  NS-­‐9200   95%   100%   100%   95.1%  

Sourcefire  7120   98%   100%   100%   97.9%  

Stonesoft  3206   95%   100%   100%   94.7%  

Figure  18  –  Security  Effectiveness  
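The values in Figure 18 follow the multiplicative formula from the Overview (Security Effectiveness = Exploit Block Rate × Anti-Evasion Rating × Stability & Reliability). A minimal sketch of that computation follows; note that the tabulated exploit block rates are rounded to whole percentages, so the unrounded underlying rate is assumed here (e.g., 97.9% rather than the displayed 98% for the Sourcefire 7120):

```python
def security_effectiveness(exploit_block_rate: float,
                           anti_evasion: float,
                           stability: float) -> float:
    """Combine the three factors multiplicatively.

    All inputs are fractions in [0, 1]; the result is also a fraction,
    so a failure in any one factor drags down the overall score.
    """
    return exploit_block_rate * anti_evasion * stability


# Sourcefire 7120: assumed unrounded 97.9% block rate, 100% on both other factors.
score = security_effectiveness(0.979, 1.00, 1.00)
print(f"{score:.1%}")  # -> 97.9%
```

Because every tested device scored 100% on both anti-evasion and stability & reliability, the security effectiveness in this report reduces to the exploit block rate; the formula only differentiates products when one of the other two factors falls below 100%.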


Managed  Security  Effectiveness

Security devices are complicated to deploy; essential systems such as centralized management consoles, log aggregation, and event correlation/management systems further complicate the purchasing decision. It is vital that enterprise security professionals are able to deploy and manage multiple devices throughout the organization in a secure and effective manner. If a device cannot be managed effectively, its security effectiveness is compromised.

As  part  of  this  test,  NSS  performed  in-­‐depth  technical  evaluations  of  the  main  features  and  capabilities  of  the  enterprise  management  systems  offered  by  each  vendor,  covering  the  following  key  areas:  

• General  Management  and  Configuration  –  how  easy  is  it  to  install  and  configure  devices,  and  deploy  multiple  devices  throughout  a  large  enterprise  network?  

• Policy  Handling  –  how  easy  is  it  to  create,  edit,  and  deploy  complicated  security  policies  across  an  enterprise?  

• Alert  Handling  –  how  accurate  and  timely  is  the  alerting,  and  how  easy  is  it  to  drill  down  to  locate  critical  information  needed  to  remediate  a  security  problem?  

• Reporting  –  how  effective  is  the  reporting  capability,  and  how  readily  can  it  be  customized?  

The  results  of  these  tests  are  reported,  along  with  detailed  cost  models,  in  the  Management  CAR  and  Total  Cost  of  Ownership  (TCO)  CAR.


©  2013  NSS  Labs,  Inc.  All  rights  reserved.  No  part  of  this  publication  may  be  reproduced,  photocopied,  stored  on  a  retrieval  system,  or  transmitted  without  the  express  written  consent  of  the  authors.    

Please  note  that  access  to  or  use  of  this  report  is  conditioned  on  the  following:  

1.  The  information  in  this  report  is  subject  to  change  by  NSS  Labs  without  notice.  

2.  The  information  in  this  report  is  believed  by  NSS  Labs  to  be  accurate  and  reliable  at  the  time  of  publication,  but  is  not  guaranteed.  All  use  of  and  reliance  on  this  report  are  at  the  reader’s  sole  risk.  NSS  Labs  is  not  liable  or  responsible  for  any  damages,  losses,  or  expenses  arising  from  any  error  or  omission  in  this  report.  

3.  NO  WARRANTIES,  EXPRESS  OR  IMPLIED  ARE  GIVEN  BY  NSS  LABS.  ALL  IMPLIED  WARRANTIES,  INCLUDING  IMPLIED  WARRANTIES  OF  MERCHANTABILITY,  FITNESS  FOR  A  PARTICULAR  PURPOSE,  AND  NON-­‐INFRINGEMENT  ARE  DISCLAIMED  AND  EXCLUDED  BY  NSS  LABS.  IN  NO  EVENT  SHALL  NSS  LABS  BE  LIABLE  FOR  ANY  CONSEQUENTIAL,  INCIDENTAL  OR  INDIRECT  DAMAGES,  OR  FOR  ANY  LOSS  OF  PROFIT,  REVENUE,  DATA,  COMPUTER  PROGRAMS,  OR  OTHER  ASSETS,  EVEN  IF  ADVISED  OF  THE  POSSIBILITY  THEREOF.  

4.  This  report  does  not  constitute  an  endorsement,  recommendation,  or  guarantee  of  any  of  the  products  (hardware  or  software)  tested  or  the  hardware  and  software  used  in  testing  the  products.  The  testing  does  not  guarantee  that  there  are  no  errors  or  defects  in  the  products  or  that  the  products  will  meet  the  reader’s  expectations,  requirements,  needs,  or  specifications,  or  that  they  will  operate  without  interruption.    

5.  This  report  does  not  imply  any  endorsement,  sponsorship,  affiliation,  or  verification  by  or  with  any  organizations  mentioned  in  this  report.    

6.  All  trademarks,  service  marks,  and  trade  names  used  in  this  report  are  the  trademarks,  service  marks,  and  trade  names  of  their  respective  owners.    

Test Methodology

Methodology Version: Network Intrusion Prevention Systems (IPS) v7.2

A copy of the test methodology is available on the NSS Labs website at www.nsslabs.com.

Contact Information

NSS Labs, Inc.
206 Wild Basin Rd
Building A, Suite 200
Austin, TX 78746
+1 (512) 961-5300
[email protected]
www.nsslabs.com

This  and  other  related  documents  available  at:  www.nsslabs.com.  To  receive  a  licensed  copy  or  report  misuse,  please  contact  NSS  Labs  at  +1  (512)  961-­‐5300  or  [email protected]