Parallelism for .NET Applications
Vance Morrison, Performance Architect, Microsoft Corporation
Why Go Parallel?
Take advantage of multi-proc hardware
- Clock speeds have plateaued; transistors are instead being used for multiple CPUs
- You can't use multi-core hardware unless you have multiple threads

Responsiveness
- GUIs should always be responsive
  - The GUI thread ONLY paints the screen; all 'real' work is done on other threads
- Servers should always be responsive
  - Each request should run on a separate 'thread'
Why Going Parallel can be Hard
For all the same reasons it is hard for people
- Many problems are inherently sequential
- Complex tasks are typically hard to parallelize; high-volume, repetitive tasks are your best bet
- Parallel algorithms are always more complicated
- There is overhead associated with communication
  - Locks can be expensive, and are typically the performance bottleneck
- A whole new class of subtle errors is possible
  - Races (misunderstandings between threads), deadlock, priority inversion, ...
- Whole new classes of performance problems
  - Hot locks, lock convoys, resource thrashing, Amdahl's Law
Be Careful Out There!
READ THE FOLLOWING: What Every Dev Must Know About Multithreaded Apps
- Shared state is the enemy!
- Identify tasks that don't modify shared state
- Organize tasks so that they return results
- Understand how to use locks
- Use a reader-writer lock for read-mostly state
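The reader-writer lock advice above can be sketched with .NET's ReaderWriterLockSlim (the Cache class here is a hypothetical illustration, not from the talk):

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

class Cache {
    // Read-mostly shared state, guarded by a reader-writer lock.
    private readonly Dictionary<string, int> map = new Dictionary<string, int>();
    private readonly ReaderWriterLockSlim rwLock = new ReaderWriterLockSlim();

    public bool TryGet(string key, out int value) {
        rwLock.EnterReadLock();           // many readers may hold the lock at once
        try { return map.TryGetValue(key, out value); }
        finally { rwLock.ExitReadLock(); }
    }

    public void Set(string key, int value) {
        rwLock.EnterWriteLock();          // writers get exclusive access
        try { map[key] = value; }
        finally { rwLock.ExitWriteLock(); }
    }
}
```

Because readers don't block each other, this scales much better than a plain lock when reads dominate.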
How .NET Can Help: Easy stuff
The .NET runtime has a multithreaded GC
- Speeds up the GC
- Today it is not the default, but that is easy to fix in the application's .exe.config:

    <configuration>
      <runtime>
        <gcServer enabled="true" />
      </runtime>
    </configuration>

- A win if you know you are running on a multi-proc machine
- Only speeds up the GC: in well-written programs the GC is less than 10% of CPU, so you can get at most ~10% out of this (Amdahl's Law)
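The "at most ~10%" claim follows from Amdahl's Law: if only a fraction p of the work is sped up across n processors, the overall speedup is 1 / ((1 - p) + p/n). A quick sketch of the arithmetic for a GC that is 10% of CPU:

```csharp
using System;

class AmdahlSketch {
    static void Main() {
        double p = 0.10;   // fraction of work that is parallelized (the GC)
        double n = 2;      // number of processors

        // Amdahl's Law: overall speedup when only fraction p runs n-way parallel.
        double speedup = 1.0 / ((1.0 - p) + p / n);
        Console.WriteLine("Speedup on 2 procs: {0:F3}", speedup);     // ~1.053

        // Even with infinitely many processors the speedup is capped at 1/(1-p).
        Console.WriteLine("Speedup limit: {0:F3}", 1.0 / (1.0 - p));  // ~1.111
    }
}
```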
How .NET Can Help: ThreadPool
- System.Threading.ThreadPool
- Anonymous delegates are REALLY handy
    int loopCount = 10;
    for (int tId = 1; tId <= 3; tId++) {    // Create three work items.
        ThreadPool.QueueUserWorkItem(delegate {
            for (int i = 0; i < loopCount; i++) {
                Console.WriteLine("Thread i={0} time: {1}", i, DateTime.Now);
                Thread.Sleep(1000);
            }
        });
    }
    Console.WriteLine("Waiting for workers.  Hit return to end program.");
    Console.ReadLine();

(Anonymous delegates can capture local variables such as loopCount.)
ThreadPool vs Threads
- Threads are much more expensive to create
- Threads have liveness/fairness guarantees: the expectation is that all threads run 'simultaneously'
- ThreadPool work is all about throughput
  - Work items are just 'exposed parallelism': no fairness guarantees, as long as machine resources are being used
  - The pool decides how many items run 'simultaneously' and tries to maximize throughput of the entire system
- Value add: the programmer need only expose parallelism as work items; the system dynamically decides how to exploit that parallelism
More Parallelism != Better Perf
[Chart: throughput vs. concurrency level for work items with 10% CPU on a dual-core system using ThreadPool.QueueUserWorkItem(). Throughput peaks at a low concurrency level and then degrades as concurrency increases.]

Reasons for perf degradation:
1. Context switches
2. Memory contention
3. Hot locks
4. Disk contention
5. Network contention
ThreadPool Is Great, But We Need More
- How do you signal when work is done?
- How do you abort work?
- More synchronization constructs
- Thread-safe data structures
- A parallel implementation of LINQ

Parallel Extensions for .NET Framework
- Currently available as an independent download
- Will be part of the next version of the framework
How .NET Can Help: PFX
Tasks: A better work item
    Task task1 = Task.Create(delegate {
        Console.WriteLine("In task 1");
        Thread.Sleep(1000);      // do work
        Console.WriteLine("Done task 1");
    });
    Task task2 = Task.Create(delegate {
        Console.WriteLine("In task 2");
        Thread.Sleep(2000);      // do work
        Console.WriteLine("Done task 2");
    });
    if (userAbort) { task1.Cancel(); task2.Cancel(); }
    Task.WaitAll(task1, task2);

- Tasks now have handles
- Exceptions in tasks are propagated on wait
- You can cancel work
- You can wait on the handle to synchronize
Futures: Tasks with a Value
    Future<int> left = Future.Create(delegate {
        Thread.Sleep(2000);      // do work
        return 1;
    });
    Future<int> right = Future.Create(delegate {
        Thread.Sleep(1000);      // do work
        return 1;
    });

    Console.WriteLine("Answer {0}", left.Value + right.Value);

- left and right represent integers, but they may not be computed yet
- Asking for the Value forces a wait if the future is not already done
- In this case the right value was already done, so no wait was needed
How .NET Can Help: Parallel Loops
- Loops are a common source of work
- Loops can be parallelized when iterations are independent: the body doesn't depend on mutable state
  - e.g., static variables, or writes to local variables that are read in subsequent iterations
- Synchronous: all iterations finish, regularly or exceptionally
Sequential:

    for (int i = 0; i < n; i++) {
        work(i);
    }
    foreach (T e in data) {
        work(e);
    }

Parallel:

    Parallel.For(0, n, i => {
        work(i);
    });
    Parallel.ForEach(data, e => {
        work(e);
    });
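As a sketch of the independence requirement above (a hypothetical example, not from the talk): this loop writes a local variable that subsequent iterations read, so converting it to Parallel.For would be incorrect.

```csharp
using System;

class LoopDependence {
    static void Main() {
        int n = 10;
        int[] sums = new int[n];
        int running = 0;                 // state carried across iterations
        for (int i = 0; i < n; i++) {
            running += i;                // iteration i depends on iteration i-1,
            sums[i] = running;           // so this is NOT safe as a Parallel.For
        }
        Console.WriteLine("sums[9] = {0}", sums[n - 1]);  // 0+1+...+9 = 45
    }
}
```

A loop body like `sums[i] = i * i;`, which touches only its own iteration's data, would be safe to parallelize.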
Parallel LINQ (PLINQ)
- Enables LINQ developers to leverage parallel hardware
- Supports all of the .NET Standard Query Operators, plus a few extension methods specific to PLINQ
  - AsParallel, AsSequential
  - WithCancellation, WithOrdering, WithMergeOption, ...
- Abstracts away parallelism details: partitions and merges data intelligently ("classic" data parallelism)
- Augments LINQ-to-Objects, doesn't replace it: you must opt in to parallelism
- Parallelism blockers: ordering, concurrent exceptions, thread affinity, over-decomposition, side effects
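A minimal sketch of opting in with AsParallel (IsPrime is a hypothetical expensive per-element predicate; the rest is the standard PLINQ surface described above):

```csharp
using System;
using System.Linq;

class PlinqSketch {
    static bool IsPrime(int n) {
        if (n < 2) return false;
        for (int d = 2; d * d <= n; d++)
            if (n % d == 0) return false;
        return true;
    }

    static void Main() {
        // AsParallel() opts the query into parallel execution;
        // otherwise it reads exactly like LINQ-to-Objects.
        int count = Enumerable.Range(1, 100000)
                              .AsParallel()
                              .Where(IsPrime)
                              .Count();
        Console.WriteLine("Primes up to 100,000: {0}", count);
    }
}
```

Removing the single AsParallel() call yields the sequential LINQ-to-Objects query with the same result, which is exactly the opt-in design the slide describes.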
Summary
- Multi-core is upon us: no more faster CPUs, so you need parallelism to exploit multi-core
- Parallelism is not easy; do your homework
  - Read What Every Dev Must Know About Multithreaded Apps
- The ThreadPool will do load balancing, but it needs parallelism exposed as work items
- Tasks and Futures are nicer work items
- Parallel.For is even nicer when you can use it
- PLINQ is even nicer when you can use that
Resources (Keywords for Web Search)
- What Every Dev Must Know About Multithreaded Apps
- Microsoft Parallel Extensions to .NET Framework
- Parallel Computing Developer Center
- Parallel Programming with .NET (pfxteam's blog)
- Joe Duffy's blog
- Concurrent Programming on Windows (book)
- Improving .NET Application Performance and Scalability
- Xperf (Pigs Can Fly) Windows performance tools blog
- Vance Morrison's blog
- Rico Mariani's blog
Related Sessions

Session Title | Speaker | Day | Time | Location
Microsoft Visual Studio: Bringing out the Best in Multicore Systems | Hazim Shafi | Mon | 1:45-3:00 PM | 502A
TL24 Improving .NET Application Performance and Scalability | Steve Carol | Wed | 1:15-2:30 PM | 153
Parallel Programming for Managed Developers with the Next Version of Microsoft Visual Studio | Daniel Moth | Wed | 10:30-11:45 AM | Petree Hall
Parallel Symposium: Addressing the Hard Problems with Concurrency | David Callahan | Thurs | 8:30-10:00 AM | 515A
Parallel Symposium: Future of Parallel Computing | Dave Detlefs | Thurs | 12:00-1:30 PM | 515A
© 2008 Microsoft Corporation. All rights reserved. Microsoft, Windows, Windows Vista and other product names are or may be registered trademarks and/or trademarks in the U.S. and/or other countries.The information herein is for informational purposes only and represents the current view of Microsoft Corporation as of the date of this presentation. Because Microsoft must respond to changing market
conditions, it should not be interpreted to be a commitment on the part of Microsoft, and Microsoft cannot guarantee the accuracy of any information provided after the date of this presentation. MICROSOFT MAKES NO WARRANTIES, EXPRESS, IMPLIED OR STATUTORY, AS TO THE INFORMATION IN THIS PRESENTATION.