BitByBit

Introduction to Foundation Models

The Foundation Models framework lets you tap into the on-device large models at the core of Apple Intelligence. You can enhance your app by using generative models to create content or perform tasks. The framework supports language understanding and generation based on model capabilities like text extraction and summarization, which you can use to:

  • Generate a title, description, or tags for content
  • Generate a list of search suggestions relevant to your app
  • Transform product reviews into structured data you can visualize
  • Invoke your own tools to assist the model with performing app-specific tasks

Note: It can take some time for the model to download and become available when a person turns on Apple Intelligence. Models may not be available if the device is in low battery mode or it becomes too warm.

(source: Generating content and performing tasks with Foundation Models)

Our app, BitByBit, will help break down any task into simple, clear steps. For example, it can turn “making homemade bread” into a concise checklist of actions.

Let’s get started!

Step 0: Download Xcode 26

For this project you’ll need Xcode 26 beta 2. You can download it here:
Xcode 26 beta 2

Please install it before proceeding.

Step 1: Set up your project

  1. Open Xcode: Launch Xcode and select Create a new Xcode project.
  2. Choose Template: Select App under the iOS tab and click Next.
  3. Name Your Project: Enter a name for your project, like BitByBit.
    • Interface: SwiftUI
    • Language: Swift

Click Next, and then save your project.

When you open your project, you’ll see the familiar template code in ContentView.swift: a globe image and the text “Hello, world!”.

Step 2: Verify Availability

Before integrating Foundation Models, we must check for model availability. In ContentView.swift, do the following:

At the top of the file:

Swift
import SwiftUI
import FoundationModels

Add a property to reference the system model:

Swift
private var model = SystemLanguageModel.default

Now replace the body with this:

Swift
switch model.availability {
case .available:
    TaskView()
case .unavailable(.deviceNotEligible):
    Text("Device not eligible")
case .unavailable(.appleIntelligenceNotEnabled):
    Text("Please enable Apple Intelligence in your device settings.")
case .unavailable(.modelNotReady):
    Text("The model isn't ready because it's downloading or because of other system reasons.")
case .unavailable(_):
    Text("The model is unavailable for an unknown reason.")
}

This gives you a fallback experience if the model isn’t available and shows TaskView when it is. TaskView is where the user will type their request.
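For reference, here’s how ContentView looks once all three pieces are in place (TaskView is the view we build in the next step):

Swift
import SwiftUI
import FoundationModels

struct ContentView: View {
    // Reference to the on-device system language model
    private var model = SystemLanguageModel.default

    var body: some View {
        switch model.availability {
        case .available:
            TaskView()
        case .unavailable(.deviceNotEligible):
            Text("Device not eligible")
        case .unavailable(.appleIntelligenceNotEnabled):
            Text("Please enable Apple Intelligence in your device settings.")
        case .unavailable(.modelNotReady):
            Text("The model isn't ready because it's downloading or because of other system reasons.")
        case .unavailable(_):
            Text("The model is unavailable for an unknown reason.")
        }
    }
}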

Let’s create the TaskView!

Step 3: Creating TaskView

It will be a very simple view: a headline, a text field for the request, a slider for the temperature, a button to run the request, and a scroll view for showing the result.

Please create a new SwiftUI view called TaskView and define the following properties:

Swift
@State private var userInput = ""
@State private var isLoading = false
@State private var result: String? = nil
@State private var temperature: Double = 1.0

In the body, replace the default code with the following:

Swift
VStack(spacing: 20) {
	Text("Bit by Bit Companion")
		.font(.largeTitle.bold())
		.multilineTextAlignment(.center)

	TextField("Your topic...", text: $userInput)
		.textFieldStyle(.roundedBorder)
		.padding(.horizontal)

	VStack(alignment: .leading) {
		HStack {
			Text("Temperature: ")
			Spacer()
			Text(String(format: "%.2f", temperature))
		}
		Slider(value: $temperature, in: 0.0...2.0)
	}
	.padding(.horizontal)

	Button {
		// action part
		// more to come - the function to receive the system language model result
	} label: {
		Label("Ask", systemImage: "arrow.forward.circle.fill")
			.font(.title2)
			.padding()
	}
	.disabled(userInput.isEmpty)

	if isLoading {
		ProgressView()
	}
	// if we got a result
	if let result = result {
		ScrollView {
			VStack(alignment: .leading, spacing: 10) {
				Text(result)
					.textSelection(.enabled)
			}
			.padding()
		}
		.padding()
	}
	Spacer()
}
.padding()

Please note that the slider is defined exactly as in earlier iOS versions, but on iOS 26 it automatically shows up with the beautiful Liquid Glass style.

Now, let’s create a PromptManager that has a method fetchBitByBit to fetch the results:

Step 4: Creating PromptManager

Create a new Swift file called PromptManager. Create a method called fetchBitByBit. This function has two inputs: the user’s prompt (the request) and a temperature value that influences how creative the model’s response is.

Temperature is an adjustment applied to the probability distribution prior to sampling. A value of 1 results in no adjustment. Values less than 1 will make the probability distribution sharper, with already likely tokens becoming even more likely. Values greater than 1 will flatten the distribution, making less probable tokens more likely.

The net effect is that low temperatures produce more stable and predictable responses, while high temperatures give the model more creative license (see the short sketch after this list):

  • Lower temperature (e.g., 0.5): more predictable, stable results.
  • Higher temperature (e.g., 1.5): more creative, diverse responses.
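Purely as an illustration (this isn’t part of the app, and the framework applies temperature internally when it samples tokens), here is a minimal sketch of how dividing logits by a temperature before a softmax sharpens or flattens a toy distribution; the logit values are made up:

Swift
import Foundation

// Toy example: how temperature reshapes a probability distribution before sampling.
func softmax(_ logits: [Double], temperature: Double) -> [Double] {
    let scaled = logits.map { $0 / temperature }       // divide logits by the temperature
    let maxScaled = scaled.max() ?? 0
    let exps = scaled.map { exp($0 - maxScaled) }      // subtract max for numerical stability
    let sum = exps.reduce(0, +)
    return exps.map { $0 / sum }
}

let logits = [2.0, 1.0, 0.5]                           // made-up logits for three tokens
print(softmax(logits, temperature: 0.5))  // sharper: the most likely token dominates
print(softmax(logits, temperature: 1.0))  // no adjustment
print(softmax(logits, temperature: 1.5))  // flatter: less likely tokens gain probability

With that in mind, start PromptManager with the following skeleton: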
Swift
import SwiftUI
import FoundationModels

struct PromptManager {
    static func fetchBitByBit(for userPrompt: String, temperature: Double = 1.0) async throws -> String {
        // We'll fill in the body step by step below.
    }
}

This function, fetchBitByBit, is marked as async, meaning it’s designed to run asynchronously. This allows it to suspend execution while waiting for a result (in this case, the model’s response) without blocking the main thread.

Why it matters:

  • SwiftUI apps run on the main thread to manage UI updates.
  • If you run a long or slow operation (like querying a language model) on the main thread, your UI would freeze.
  • By using async, Swift can perform the long-running task in the background and return the result when it’s ready.
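As a tiny, hypothetical illustration (not part of the app), an async function simply suspends at its await points while the main thread keeps serving the UI:

Swift
// Hypothetical helper: simulates a slow operation without blocking the caller.
func slowLookup() async -> String {
    // Suspends this task for two seconds; the main thread stays free.
    try? await Task.sleep(for: .seconds(2))
    return "done"
}

// Called from a Task, e.g. Task { let answer = await slowLookup() }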

Since we have already confirmed that the model is available, we can create a new LanguageModelSession. Here we can pass in some instructions. Instructions define the model’s intended behavior on prompts. Since we want to break a topic down into the steps needed to complete it, we use the following instructions within the function fetchBitByBit:

Swift
let instructions = """
            Suggest a maximum of ten main steps to the topic provided to complete the task. Keep them concise (three to seven words) and make sure they build naturally from the person's topic.
            """

Finally, we create the generation options and the session:

Swift
let options = GenerationOptions(temperature: temperature)
let session = LanguageModelSession(instructions: instructions)

We can now fetch the result by simply asking the session to respond to the user’s request:

Swift
let response = try await session.respond(to: userPrompt, options: options)
return response.content

  • The await keyword pauses the function at this line until the session.respond(…) finishes.
  • But importantly: other tasks (like rendering UI or handling user input) continue running in the meantime.
  • Once the model returns a response, execution resumes and the function continues.
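Putting the snippets together, the complete PromptManager looks like this:

Swift
import SwiftUI
import FoundationModels

struct PromptManager {
    static func fetchBitByBit(for userPrompt: String, temperature: Double = 1.0) async throws -> String {
        // Instructions define the model's intended behavior on prompts.
        let instructions = """
            Suggest a maximum of ten main steps to the topic provided to complete the task. Keep them concise (three to seven words) and make sure they build naturally from the person's topic.
            """

        // Temperature controls how predictable or creative the response is.
        let options = GenerationOptions(temperature: temperature)
        let session = LanguageModelSession(instructions: instructions)

        // Suspend until the model responds, then return its text.
        let response = try await session.respond(to: userPrompt, options: options)
        return response.content
    }
}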

That’s all it takes to get the result! The only step left is to trigger this request via the button we created in TaskView.

Step 5: Trigger the Model Fetch

Within the action part of the button, we can now fetch the result of the request:

Swift
Task {
	isLoading = true
	
	let response = try? await PromptManager.fetchBitByBit(for: userInput, temperature: temperature)

	result = response ?? "No answer"
	isLoading = false
}

Task { ... } creates a new unstructured task. It inherits the surrounding actor context (in a SwiftUI view, the main actor), so it’s safe to update @State properties like isLoading and result inside it. Swift automatically suspends the task when it hits await, so the UI keeps responding while the model generates its answer.
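For reference, here is how the complete Button in TaskView looks once the action is filled in:

Swift
Button {
    Task {
        isLoading = true

        // Ask the model for a step-by-step breakdown of the user's topic.
        let response = try? await PromptManager.fetchBitByBit(for: userInput, temperature: temperature)

        result = response ?? "No answer"
        isLoading = false
    }
} label: {
    Label("Ask", systemImage: "arrow.forward.circle.fill")
        .font(.title2)
        .padding()
}
.disabled(userInput.isEmpty)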

That’s all!

Step 6: Run your app

You can already see a preview of your app in the canvas on the right-hand side. To run your app, it’s best to use a physical device: press the Play button or simply use Cmd + R.

Optional Step: Fine-tuning the Prompt

Since it is cumbersome to fine-tune the prompt if you always have to compile and run the app, Apple integrated this functionality into Playgrounds. If you want to experiment with the prompt, create a new file, e.g. PromptTweak, and make sure the canvas is shown. Then include:

Swift
import Playgrounds
import FoundationModels

#Playground {

    let session = LanguageModelSession(
        instructions: """
        Suggest a maximum of ten main steps to the topic provided to complete the task. Keep them concise (three to seven words) and make sure they build naturally from the person's topic.
        """
    )

    let prompt = "Making homemade bread"

    let options = GenerationOptions(temperature: 1.0)
    let response = try await session.respond(
        to: prompt,
        options: options
    )
}

In the canvas on the right-hand side, you can see the response to your instructions and prompt. This makes it easier to fine-tune them without having to run the entire app each time.

Congratulations!

You’ve successfully built your first app using the new Foundation Models Framework! 🎉

Try some different requests like

  • make homemade bread
  • build a meditation habit

Have fun when experimenting!

What you have learned

In this code-along, you’ve learned how to:

  • 💡 Access Apple’s new Foundation Models using the FoundationModels framework
  • 🧠 Check model availability with SystemLanguageModel.default and handle fallback scenarios
  • 🧵 Run asynchronous tasks safely using async/await and Task {} in SwiftUI
  • 🔄 Create structured prompts and pass them to a LanguageModelSession for generative output
  • 🌡️ Use temperature settings to fine-tune the creativity and predictability of the model’s responses
  • 🧰 Build a reusable helper (PromptManager) that encapsulates the prompt logic cleanly
  • 🧪 Experiment with prompts in Playgrounds to fine-tune instructions before integrating them into the app
  • 📱 Design a responsive SwiftUI interface that updates live with loading indicators and user feedback

That’s a lot to cover! You made it this far, very well done! 🎉

That’s a wrap!

Keep learning, keep building, and let your curiosity guide you. Happy coding! ✨

The best developers aren’t the ones who never struggle, they are the ones who never quit. — Paul Hudson


Download the full project on GitHub: https://github.com/swiftandcurious/BitByBit