In Part I, we discussed some of the motivations behind Swift and a little bit of the history that led up to its release. In this article, we'll talk about the different environments available for writing Swift. Then, we'll go behind the scenes and look at how Swift code is transformed into something that can run on different processors (e.g. x86-64 for OS X and ARM for iOS).
If you're looking to build an OS X or iOS application with Swift, you will probably end up spending most of your time in Apple's Xcode application. Like other development environments (e.g. Eclipse, Visual Studio, etc.), Xcode includes many features to help you be more productive. However, Xcode has its own learning curve and hides many of the interesting things that are going on under the covers. A tool that shields you from the internal details can be a good thing when you're trying to get work done. However, when learning a new language like Swift, it is a worthwhile exercise to step away from the big fancy tool and get your hands a little dirty.
The folks who work on the official Swift blog have put together a short video titled Building Your First Swift App Video, which demonstrates building a simple iOS app with Swift and Xcode.
Another option that is available for writing Swift code is to create a Swift Playground (feature included with Xcode 6). A Playground provides a place where you can write Swift code and have it executed immediately as you type it. This type of environment is great for exploring the language and trying out pieces of code before you put them into an actual application.
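To get a feel for what this looks like, here are a few lines you might type into a new playground (a hypothetical example, not tied to any particular project); the result of each line appears immediately in the sidebar as you type:

let greeting = "Hello, playground"
let numbers = [1, 2, 3, 4]
let doubled = numbers.map { $0 * 2 }   // [2, 4, 6, 8] shows up in the results sidebar
println("\(greeting): \(doubled)")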
There are a huge number of tutorials available on the Web about creating and using a Swift Playground. Simply type "swift playground" into your favorite search engine and take your pick. There is also a great video from Apple about Playgrounds here.
If you want to go a little more low-level or if you're one of those people that just feels more at home on the command-line, you can fire up the Swift REPL (also installed as part of Xcode 6). The Swift REPL is a command-line tool that provides an interactive Read Eval Print Loop. What that means is that each line of code is executed as you type it. Sound familiar? A Swift Playground is really just a pretty wrapper around the Swift REPL.
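For example, launching the REPL from a terminal and typing a couple of lines produces something roughly like the following (the exact banner, prompts, and result-variable names vary by Xcode version):

xcrun swift
  1> let a = 3
a: Int = 3
  2> a + 4
$R0: Int = 7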
For a great introduction to the Swift REPL, go check out another article from our friends over on the official Apple Swift blog.
Now it's time to answer some questions that are often asked but rarely answered in blog articles and tutorials about Swift.
At a high-level, the job of any compiler is to take source code and turn it into something that can run on a processor (machine code). As part of that process, the source code is typically turned into an intermediate form, optimized, and then transformed into machine code. Those jobs can be split up into three separate components – the frontend, the optimizer, and the backend.
In Part I of this series, I talked about some of the history that led up to the development of Swift. During that discussion, the LLVM Compiler Infrastructure was mentioned. LLVM is not a compiler in the traditional sense. It's more appropriate to think of it as scaffolding that can be used for building actual compilers (hence the "infrastructure" part of the name). The result is a system that can support multiple languages and multiple hardware platforms without each compiler having to reinvent the wheel.
For this design to work, each frontend must translate the original source code into a common language that can be consumed by the optimizer. LLVM defines a new language for this purpose called LLVM Intermediate Representation (IR). LLVM IR is not something that can be packaged for deployment like Java bytecode or .NET IL. It is simply a step taken by the compiler on the road from source code to machine code. In the case of Swift, your code actually ends up getting transformed into two other formats before LLVM IR. Here is the whole picture for Swift: your source code is parsed into an abstract syntax tree (AST), lowered into Swift Intermediate Language (SIL), translated into LLVM IR, and finally turned into machine code for the target processor.
This is all very interesting but still very abstract. Let's try to improve that situation by examining an actual Swift program during each step of the process.
For our adventure into the Swift compiler, we'll simply need a text editor and the trusty command line. Create a file named onion.swift containing the following four lines:
let a = 3
let b = 4
let c = a + b
println(c)
The command-line Swift compiler is swiftc. To see what it can do, run:

swiftc -help
If you look at some of the "modes" that are listed near the top of the output, you'll see options that correspond to what's shown in Figure 3.
-dump-ast        Parse and ... dump AST(s)
-emit-sil        Emit canonical SIL file(s)
-emit-ir         Emit LLVM IR file(s)
-emit-assembly   Emit assembly file(s)
Before we use these modes to examine the individual steps, we can use a mode that will combine all of the steps and build a working executable.
swiftc -emit-executable onion.swift
This will create a new executable named onion in the same directory. Let's run it:
./onion
Great! We can, in fact, eat the onion that we created: the program runs and prints 7. Now we can start peeling back the layers.
The first layer we'll look at is the abstract syntax tree (AST). This is the first step for most compilers – to convert the source code from a form that is "human friendly" into a syntax tree that is easier to work with programmatically.
Run the following command to output the AST for our simple Swift application:
swiftc -dump-ast onion.swift
The AST isn't the prettiest to look at, but it isn't meant to be. However, with this simple program, you can probably follow what's there and maybe even pick up a few interesting nuggets. One nugget we'll focus on right now is the content that corresponds to the addition operation (let c = a + b).
(declref_expr type='(Int, Int) -> Int' location=onion.swift:3:11 range=[onion.swift:3:11 - line:3:11] decl=Swift.(file).+ specialized=no)
Nothing really special here – it's an expression that takes two Swift integers and returns a Swift integer.
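You can see the same thing from within Swift itself: since + resolves to an ordinary function of type (Int, Int) -> Int, it can be referred to as a function value. A small sketch (the parenthesized operator reference is purely for illustration):

let add: (Int, Int) -> Int = (+)   // the same Swift.+ function the AST refers to
println(add(3, 4))                 // prints 7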
The next layer of our onion is the Swift Intermediate Language that is generated from the AST. This intermediate form is used by the Swift optimizer to perform Swift language specific optimizations prior to generating the LLVM IR.
To view the SIL for our Swift application, execute the following command:
swiftc -emit-sil onion.swift
Well, this is even uglier to look at but that's sort of expected since we're moving closer to the machine code representation of our program. Once again, let's see if we can spot the code that corresponds to "let c = a + b".
%11 = builtin_function_ref "sadd_with_overflow_Word" : $@thin (Builtin.Word, Builtin.Word, Builtin.Int1) -> (Builtin.Word, Builtin.Int1)
What's interesting here is that the addition operation is being performed through the use of a "builtin" function named "sadd_with_overflow_Word". Builtin functions are special since they are available only to the Swift standard library and typically map directly to LLVM IR instructions.
The use of "Word" as the data type reflects the fact that the Int data type in Swift corresponds to the native word size of the target architecture (32 or 64 bits).
It's time for the next layer of the onion. This is the point where the SIL is translated into LLVM IR. This is the representation that the LLVM optimizer can work with to perform language-independent optimizations such as inlining and loop optimizations.
To peel back this layer, execute the following command to view the LLVM IR for our Swift program:
swiftc -emit-ir onion.swift
Now things are really starting to look low-level. Once again, let's see if we can spot our addition operation.
%3 = call { i64, i1 } @llvm.sadd.with.overflow.i64(i64 %1, i64 %2)
So, now our addition operation has been transformed into a call to the llvm.sadd.with.overflow.i64 intrinsic. One thing that's interesting here is that if we were compiling a different language (e.g. Objective-C), we might very well see the same LLVM IR.
If learning more about LLVM IR is something you're really into, you're in luck. LLVM is extensively documented here. If you want to invent your own language and use LLVM, you'll need to build a frontend that can generate LLVM IR. The good news – that's all you'll have to do. You'll get the LLVM optimizer and backend code generators for free.
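To give a sense of what hand-written LLVM IR looks like, here is a minimal, hypothetical function that adds two 64-bit integers, without the overflow checking that Swift inserts:

define i64 @add(i64 %a, i64 %b) {
entry:
  %sum = add i64 %a, %b      ; plain 64-bit add, no overflow check
  ret i64 %sum
}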
If you've been able to follow along up until now – congratulations. We're now at the last layer of the onion. It's time to see the platform-specific assembly language for our Swift program.
To view the assembly language that's created from the LLVM IR, execute the following command:
swiftc -emit-assembly onion.swift
There is some very good news contained within this output. Specifically, we see that our high-level Swift code that adds two Swift integers does in fact translate into the proper (and very efficient) x86-64 addq instruction for adding two 64-bit integers.
Admittedly, we have been examining a very simple case (adding two integers). However, you can use this same procedure to examine every layer of the onion for any piece of Swift code.
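As a suggested next experiment, try walking a slightly richer (hypothetical) program through the same commands and watch how a function call shows up at each layer:

func square(x: Int) -> Int {
    return x * x
}
println(square(7))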
One important caveat is that we have been performing all of these operations without optimizations enabled. If you decide to perform some additional experiments with more complex code, you can include the -O option when executing the commands in this article to enable optimizations, so that you can see output closer to what you would receive when compiling a project with Xcode in release mode.
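For example, combining -O with the modes used above:

swiftc -O -emit-sil onion.swift
swiftc -O -emit-assembly onion.swift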
In this article, we've looked at how the Swift code you write gets transformed into platform-specific machine code. There are times when Swift code is run interactively, line by line, as we see with Playgrounds and the Swift REPL. However, when you build a project with Xcode, your code is compiled into a native executable, and the compiler does a pretty good job. In fact, in many cases, your Swift code will execute faster than the equivalent Objective-C code.
As the saying goes though – garbage in, garbage out. The Swift and LLVM optimizers will do the best they can with your code but they can't determine your intent if you don't express it clearly in Swift. In Part III, we’ll look at a syntax-level feature of Swift that causes a lot of developers trouble – optionals.
Written by Jason Bell
Jason is Accelebrate's Director of Technology, where he ensures the smooth and effective running of Accelebrate's IT infrastructure to support the organization's continued growth. Jason is also a trainer specializing in computer software design and development, C/C++, Microsoft .NET, Apple OS X, iOS (Objective-C and Swift), IIS, and more.