You use it when you want to execute a block and wait for the results. One example of this is the pattern where you're using a dispatch queue instead of locks for synchronization. For example, assume you have a shared NSMutableArray a, with access mediated by dispatch queue q.
A background thread might be appending to the array (async), while your foreground thread is pulling the first item off (synchronously):

    NSMutableArray *a = [[NSMutableArray alloc] init];
    // All access to `a` is via this dispatch queue!
    dispatch_queue_t q = dispatch_queue_create("com.foo.samplequeue", NULL);

    dispatch_async(q, ^{ [a addObject:something]; }); // append to array, non-blocking

    __block Something *first = nil;      // "__block" to make results from the block available
    dispatch_sync(q, ^{                  // note that these 3 statements...
        if ([a count] > 0) {             // ...are all executed together...
            first = [a objectAtIndex:0]; // ...as part of a single block...
            [a removeObjectAtIndex:0];   // ...to ensure consistent results
        }
    });
I'll +1 this, since it's technically correct, although there isn't much value in doing a single dispatch_async followed by a dispatch_sync on the same queue. However, this same pattern is useful when you want to spawn multiple concurrent jobs on another queue and then wait for them all. – kperryua Jan 5 at 19:57
Thanks. This is beginning to make sense. What if I want to start multiple concurrent threads using dispatch_apply that access a single resource with mutual exclusion? How do I do this with GCD? Is the only way to use a dispatch_async with a serial queue within my dispatch_apply? Is there a way to use dispatch_sync? – Rasputin Jones Jan 5 at 20:56
@kperryua - sorry if the example wasn't clear - the idea is that a separate thread would be doing multiple dispatch_async's to the queue – David Gelhar Jan 6 at 1:20
@David Gelhar - No problem. Just making mention for others who come looking. – kperryua Jan 6 at 6:01
I also like to think of it as being similar to using -performSelector:onThread:withObject:waitUntilDone: or performSelectorOnMainThread:withObject:waitUntilDone: and setting waitUntilDone to YES. – Brad Larson Jan 6 at 18:42
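To illustrate the "spawn several jobs, then wait for them all" pattern discussed in the comments above, here is a minimal sketch in plain C against the GCD API. The queue name and the workload are illustrative, not part of the original answer: several dispatch_async calls queue work on a serial queue, and a final dispatch_sync doesn't return until everything queued ahead of it has finished.

    #include <dispatch/dispatch.h>
    #include <stdio.h>

    int main(void) {
        // Serial queue: blocks run one at a time, in FIFO order.
        dispatch_queue_t q = dispatch_queue_create("com.example.workqueue", NULL);
        __block int total = 0;

        // Fire off several jobs without blocking the caller.
        for (int i = 1; i <= 5; i++) {
            dispatch_async(q, ^{ total += i; });
        }

        // Because the queue is serial, this block runs only after the five
        // async blocks above have finished, so dispatch_sync acts as a "wait".
        dispatch_sync(q, ^{
            printf("all jobs done, total = %d\n", total);
        });
        return 0;
    }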
dispatch_sync is semantically equivalent to a traditional mutex lock.

    dispatch_sync(queue, ^{
        // access shared resource
    });

works the same as

    pthread_mutex_lock(&lock);
    // access shared resource
    pthread_mutex_unlock(&lock);
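As a self-contained sketch of that equivalence (the queue, function, and variable names are illustrative, not from the answer): a shared counter protected once by dispatch_sync on a serial queue and once by a pthread mutex. Either way, only one thread is inside the critical section at a time.

    #include <dispatch/dispatch.h>
    #include <pthread.h>
    #include <stdio.h>

    static int counter = 0;

    // A serial queue used as a lock: at most one block runs at a time.
    static dispatch_queue_t counter_queue;

    // The traditional-mutex equivalent.
    static pthread_mutex_t counter_lock = PTHREAD_MUTEX_INITIALIZER;

    static void increment_with_queue(void) {
        dispatch_sync(counter_queue, ^{
            counter++;                       // critical section
        });
    }

    static void increment_with_mutex(void) {
        pthread_mutex_lock(&counter_lock);
        counter++;                           // same critical section
        pthread_mutex_unlock(&counter_lock);
    }

    int main(void) {
        counter_queue = dispatch_queue_create("com.example.counter", NULL);
        increment_with_queue();
        increment_with_mutex();
        printf("counter = %d\n", counter);   // prints 2
        return 0;
    }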
Here's a half-way realistic example. You have 2000 zip files that you want to analyze in parallel. But the zip library isn't thread-safe.
Therefore, all work that touches the zip library goes into the unzipQueue queue. (The example is in Ruby, but all calls map directly to the C library; "apply", for example, maps to dispatch_apply(3).)

    #!/usr/bin/env macruby -w

    require 'rubygems'
    require 'zip/zipfilesystem'

    @unzipQueue = Dispatch::Queue.new('ch.unibe.niko.unzipQueue')

    def extractFile(n)
      @unzipQueue.sync do
        Zip::ZipFile.open("Quelltext.zip") { |zipfile|
          sourceCode = zipfile.file.read("graph.php")
        }
      end
    end

    Dispatch::Queue.concurrent.apply(2000) do |i|
      puts i if i % 200 == 0
      extractFile(i)
    end