MIT Spam Conference, March 27-28, 2008
“(Ab)Using ICANN’s Procedures as a Way to Minimize Spam”
Bob Bruen and Garth Bruen
Standard Approaches
• Filter & Block
• Identify Spammers
• Blacklist
• Criminal Prosecution
• Civil Litigation
• Challenge/Response
• Reputation Protection
Definition: Infrastructure
The Front End
• ICANN
• Top Level Registrars
• Retail Registrars
• ISPs
• Policies and Procedures
• Resource Capacity
Front End Problems
• Because of:
  Weak procedures
  Policies not followed
  Inadequate resources
• Consequences are:
  Target-rich environment
  Spam platform
  Enhances botnets, malware, etc.
Whois Data Problem Report System (WDPRS)
• Whois data accuracy REQUIRED
• 15 days to fix a whois record
• Created for just these complaints
• One-at-a-time complaints
• Designed for small numbers
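The 15-day correction window above is the hook the complaint process turns on. A minimal sketch of tracking that window, assuming the clock starts on the filing date (function and variable names here are illustrative, not part of WDPRS itself):

```python
# Track the 15-day whois-correction window that follows a WDPRS complaint.
# The names below are illustrative assumptions, not an official WDPRS API.
from datetime import date, timedelta

CORRECTION_WINDOW = timedelta(days=15)  # registrant must fix whois within 15 days

def correction_deadline(complaint_filed: date) -> date:
    """Date by which the whois record must be corrected."""
    return complaint_filed + CORRECTION_WINDOW

def is_overdue(complaint_filed: date, today: date) -> bool:
    """True once the correction window has elapsed without action."""
    return today > correction_deadline(complaint_filed)

filed = date(2008, 3, 1)
print(correction_deadline(filed))            # 2008-03-16
print(is_overdue(filed, date(2008, 3, 20)))  # True
```

A tracker like this is what makes automated follow-up possible: once a complaint is overdue, it can be escalated without a human re-checking dates.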
Modern Complaint Process
• Match spammers’ capability
• Employ large-scale operations
• Automate everything:
  Processing spam submissions
  Filing of complaints
  Follow-ups
KnujOn
• Delivers Massively Scalable Automated Spam Handling
• Strict Use of ICANN Procedures Once Detected

Front End Spam Prevention Complements
Spam Detection & Elimination
What Is Different
• Not a honeypot – real people
• Spam collection spans years
• Targeting transaction sites
• Apply ICANN policy enforcement
• Scale of complaints filed:
  ICANN Report 2006: ~45% was Project KnujOn
Volume of KnujOn Reports
2008 volume is anticipated to be four times that of 2007.

[Chart: KnujOn Complaint Volume Through ICANN WDPRS, ’06-’09; y-axis 0-250,000 complaints]
KnujOn – Key Processes
• “Follow the money”
• User-submitted spam (ftp or email)
• Spam analyzed for transaction site
• Whois data acquired & verified
• Automated complaint filed if not accurate
• Follow up
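The steps above can be sketched as a small pipeline. This is a hypothetical illustration, not KnujOn’s actual code: the function names, the whois fields checked, and the stubbed whois source are all assumptions.

```python
# Illustrative sketch of the complaint pipeline: extract transaction-site
# domains from spam, check their whois records, flag inaccurate ones.
import re

def extract_transaction_sites(spam_body: str) -> list[str]:
    """Pull candidate transaction-site domains out of a spam message."""
    urls = re.findall(r"https?://([\w.-]+)", spam_body)
    return sorted(set(d.lower() for d in urls))

def whois_looks_accurate(record: dict) -> bool:
    """Crude plausibility check on a whois record (illustrative fields only)."""
    required = ("registrant_name", "registrant_address", "registrant_email")
    return all(record.get(field, "").strip() for field in required)

def process_submission(spam_body: str, whois_lookup) -> list[str]:
    """Return the domains whose whois data warrants a WDPRS complaint."""
    flagged = []
    for domain in extract_transaction_sites(spam_body):
        record = whois_lookup(domain)          # acquire whois data
        if not whois_looks_accurate(record):   # verify; flag if inaccurate
            flagged.append(domain)
    return flagged

# Example with a stubbed whois source returning an empty record:
fake_whois = lambda d: {"registrant_name": "", "registrant_email": ""}
spam = "Buy now at http://cheap-pills.example.com/deal"
print(process_submission(spam, fake_whois))  # ['cheap-pills.example.com']
```

The point of the sketch: every stage is mechanical, so a submission can go from user inbox to filed complaint without manual handling.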
Metadata
• Large Database
• We can correlate:
  Scam sites & individuals
  Sites & criminal groups
  Groups, ISPs, Registrars
• Analyze trends
Scale Problem
• 50,000,000 registrations in 2007
• 50,000 complaints – apparent limit
• Off by three orders of magnitude
• Shut down 55,000+ (proof of concept)
• 20,000-25,000/day submissions
“Big” Problem Actually Small

93% of complaints are concentrated at just 10 registrars.

[Pie chart: 10 registrars vs. all other registrars]
Repairing the Infrastructure
• Evaluate registrar services
• Rate registrars
• Rate ISPs
• Challenge privacy protection
• Test whois services
• Identify fake DNS servers
Registrar Evaluation
• Number of complaints
• Filed & total
• Acknowledgment/timeliness
• Action taken
• Rot days
• Engagement
Rot Days
• “Rot days” = suspend date – file date
• Should be shorter than:
  Tasting days = 5 days (Add Grace Period)
  Average lifetime = 5 days (UCSD paper)
• Unfortunately increasing
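The metric above is simple date arithmetic; a minimal sketch (dates are made-up examples) comparing a domain’s rot days against the 5-day Add Grace Period:

```python
# "Rot days" = suspend date minus complaint-file date, compared against
# the 5-day Add Grace Period ("tasting" window). Example dates are made up.
from datetime import date

TASTING_PERIOD_DAYS = 5  # ICANN Add Grace Period

def rot_days(filed: date, suspended: date) -> int:
    """Days a reported domain kept operating after the complaint was filed."""
    return (suspended - filed).days

filed, suspended = date(2008, 1, 2), date(2008, 1, 30)
days = rot_days(filed, suspended)
print(days)                         # 28
print(days <= TASTING_PERIOD_DAYS)  # False: the domain outlived the tasting window
```

The comparison is the whole argument: if a free-tasted domain lives only 5 days, a complaint that takes 28 days to act on arrives after the damage is done.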
Rot Days

[Chart: days to suspension, 0-40; optimum (on discovery) vs. the 5-day “tasting period” vs. real time to suspension]
Sample Registrar Rating: Caveats
• Only uses our filed complaints
• Relative ratings matter
• Small sample: n = 9 (of ~1,000 registrars)
• Better & worse registrars exist
• Only .com numbers
Example Rating Table
Registrar   Total Domains   Complaints Filed   Complaint Rate
GODADDY        15,295,392             12,036            0.08%
nameking          788,110                713            0.09%
TUCOWS          4,552,986              7,646            0.17%
Markmon           206,593                594            0.29%
NETSOL          5,046,746             15,397            0.31%
BIZCN             223,728                815            0.36%
ENOM            6,179,440             39,609            0.64%
directnfo       1,064,697              9,201            0.86%
MONIKER         1,956,780             29,855            1.53%

Sorted by rate – smaller is better.
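The rating in the table reduces to one ratio: complaints filed divided by total domains, sorted ascending. A sketch with three rows from the table (the data layout is an assumption; the figures are from the table above):

```python
# Complaint rate = complaints filed / total domains; smaller is better.
# Tuples: (registrar, total domains, complaints filed), from the table above.
registrars = [
    ("GODADDY", 15_295_392, 12_036),
    ("ENOM", 6_179_440, 39_609),
    ("MONIKER", 1_956_780, 29_855),
]

rated = sorted(
    ((name, filed / total) for name, total, filed in registrars),
    key=lambda pair: pair[1],  # ascending: best-behaved registrar first
)
for name, rate in rated:
    print(f"{name:8s} {rate:.2%}")
# GODADDY  0.08%
# ENOM     0.64%
# MONIKER  1.53%
```

Normalizing by portfolio size is the design choice that matters here: raw complaint counts would make the largest registrars look worst regardless of behavior.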
Goals
• Fix the WDPRS
• Enforce the rules
• Audit the registrars
• Terminate the bad registrars
Thank You
Bob Bruen
[email protected]
http://www.coldrain.net

Garth Bruen
[email protected]
http://www.knujon.com