It took a couple of LLM prompts plus a few simple syntax tweaks from my side, and now DOOM on the Apple Watch has audio. 🤯 It was almost as easy as typing iddqd. To get around the lack of SDL support, AVFoundation is used. Check out the GitHub commits.
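The commits are the source of truth, but the general idea is to stand in for SDL's audio callback with an AVAudioEngine that plays PCM buffers as the game mixes them. Here's a rough sketch of that approach, assuming stereo float samples; the type and method names (GameAudioOutput, enqueue) are hypothetical and not the actual code from the port:

import AVFoundation

// Minimal sketch: push raw PCM from the game's mixer into AVFoundation,
// playing the role SDL's audio callback would normally fill.
final class GameAudioOutput {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    private let format = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 2)!

    func start() throws {
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: format)
        try engine.start()
        player.play()
    }

    // Called whenever the game has a new chunk of interleaved stereo samples ready.
    func enqueue(samples: [Float]) {
        let frameCount = AVAudioFrameCount(samples.count / 2)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else { return }
        buffer.frameLength = frameCount
        // De-interleave left/right samples into the buffer's channel data.
        let left = buffer.floatChannelData![0]
        let right = buffer.floatChannelData![1]
        for frame in 0..<Int(frameCount) {
            left[frame] = samples[frame * 2]
            right[frame] = samples[frame * 2 + 1]
        }
        player.scheduleBuffer(buffer, completionHandler: nil)
    }
}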
Category: AI
Over the past couple of years (2022, 2023) I’ve compared ChatGPT’s output for the following prompt:
Draw a Christmas tree using SwiftUI
For 2024 I chose the GPT-4o model, and this is what it output:
import SwiftUI

struct ChristmasTreeView: View {
    var body: some View {
        ZStack {
            // Background color
            Color.blue
                .edgesIgnoringSafeArea(.all)

            // Tree
            VStack(spacing: -20) {
                Triangle()
                    .fill(Color.green)
                    .frame(width: 150, height: 100)
                Triangle()
                    .fill(Color.green)
                    .frame(width: 180, height: 120)
                Triangle()
                    .fill(Color.green)
                    .frame(width: 210, height: 140)

                // Tree trunk
                Rectangle()
                    .fill(Color.brown)
                    .frame(width: 40, height: 60)
            }
            .offset(y: 50)

            // Star on top
            StarShape()
                .fill(Color.yellow)
                .frame(width: 40, height: 40)
                .offset(y: -170)

            // Decorations (Ornaments)
            Circle()
                .fill(Color.red)
                .frame(width: 15, height: 15)
                .offset(x: -40, y: -60)
            Circle()
                .fill(Color.yellow)
                .frame(width: 15, height: 15)
                .offset(x: 50, y: -30)
            Circle()
                .fill(Color.white)
                .frame(width: 15, height: 15)
                .offset(x: -30, y: 20)
            Circle()
                .fill(Color.blue)
                .frame(width: 15, height: 15)
                .offset(x: 40, y: 60)
        }
    }
}

// Triangle shape for the tree
struct Triangle: Shape {
    func path(in rect: CGRect) -> Path {
        Path { path in
            path.move(to: CGPoint(x: rect.midX, y: rect.minY))
            path.addLine(to: CGPoint(x: rect.minX, y: rect.maxY))
            path.addLine(to: CGPoint(x: rect.maxX, y: rect.maxY))
            path.closeSubpath()
        }
    }
}

// Star shape for the top of the tree
struct StarShape: Shape {
    func path(in rect: CGRect) -> Path {
        let points = [
            CGPoint(x: 0.5, y: 0),
            CGPoint(x: 0.61, y: 0.35),
            CGPoint(x: 1, y: 0.35),
            CGPoint(x: 0.68, y: 0.57),
            CGPoint(x: 0.79, y: 0.91),
            CGPoint(x: 0.5, y: 0.7),
            CGPoint(x: 0.21, y: 0.91),
            CGPoint(x: 0.32, y: 0.57),
            CGPoint(x: 0, y: 0.35),
            CGPoint(x: 0.39, y: 0.35)
        ].map { CGPoint(x: $0.x * rect.width, y: $0.y * rect.height) }

        return Path { path in
            path.move(to: points[0])
            for point in points.dropFirst() {
                path.addLine(to: point)
            }
            path.closeSubpath()
        }
    }
}

struct ChristmasTreeView_Previews: PreviewProvider {
    static var previews: some View {
        ChristmasTreeView()
    }
}
Which generated:




This is the best one yet! I didn’t have to add or change any code to make it work, and it added ornaments and even a (floating 😀) star! 🎄
Revisiting last year’s challenge, let’s see how this year goes with GPT-4.
I used the same prompt from 2022:
Draw a Christmas tree using SwiftUI
Interestingly, this time it included the Triangle type, and it was almost line-for-line the exact one I substituted in last year.
import SwiftUI

struct ContentView: View {
    var body: some View {
        ZStack {
            // Background
            Color.white.edgesIgnoringSafeArea(.all)

            // Christmas Tree
            VStack {
                ForEach((1...5).reversed(), id: \.self) { i in
                    TreeLayer(width: CGFloat(i) * 60)
                }
                Rectangle()
                    .frame(width: 40, height: 50)
                    .foregroundColor(.brown)
            }
        }
    }
}

struct TreeLayer: View {
    var width: CGFloat

    var body: some View {
        ZStack {
            Triangle()
                .fill(Color.green)
                .frame(width: width, height: width * 0.75)
            // Add decorations here
        }
    }
}

struct Triangle: Shape {
    func path(in rect: CGRect) -> Path {
        var path = Path()
        path.move(to: CGPoint(x: rect.midX, y: rect.minY))
        path.addLine(to: CGPoint(x: rect.minX, y: rect.maxY))
        path.addLine(to: CGPoint(x: rect.maxX, y: rect.maxY))
        path.closeSubpath()
        return path
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
And here’s the output for 2022 and 2023! 🎄😀


Thanks to Whisper and this awesome port, the tree is responding to spoken words. 🗣🎄
Since the tree itself only has a low-powered MCU, we need another machine to act as a listener.
The architecture is:
- A machine in my office runs the Whisper model and listens for words.
- If certain keywords are found it finds a corresponding command to run (e.g. do a theater chase sequence in a green color).
- It sends that command to the tree over the network.
For now I’m running it from iOS and macOS, so I wrote the current implementation in Swift. The code is still in “hack” status, but it’s working well!
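For illustration, here’s a minimal sketch of the keyword-to-command step, assuming the Whisper transcription arrives as plain text and the tree exposes a simple HTTP endpoint on the local network; the host name and command strings are hypothetical, not the actual protocol:

import Foundation

// Hypothetical mapping from a spoken keyword to a command the tree understands.
struct TreeCommand {
    let keyword: String
    let command: String   // e.g. "theater_chase?color=green"
}

let commands = [
    TreeCommand(keyword: "green", command: "theater_chase?color=green"),
    TreeCommand(keyword: "sparkle", command: "sparkle"),
]

// Scan a transcription for keywords and fire the matching command at the tree.
func handle(transcription: String, treeHost: String = "tree.local") {
    let lowered = transcription.lowercased()
    for entry in commands where lowered.contains(entry.keyword) {
        guard let url = URL(string: "http://\(treeHost)/\(entry.command)") else { continue }
        URLSession.shared.dataTask(with: url).resume()
    }
}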
Now it’s time to test it when talking to coworkers at Automattic.
Here’s the prompt I gave ChatGPT:
Draw a Christmas tree using SwiftUI
Here’s the code I got:
struct ChristmasTree: View {
    var body: some View {
        ZStack {
            // draw the tree trunk
            Rectangle()
                .fill(Color.brown)
                .frame(width: 20, height: 100)

            // draw the tree branches
            ForEach(0..<5) { i in
                HStack(spacing: 5) {
                    ForEach(0..<5-i) { j in
                        Triangle()
                            .fill(Color.green)
                            .frame(width: 10, height: 10)
                    }
                }
                .offset(x: 0, y: -10 * CGFloat(i))
            }

            // draw the star on top of the tree
            Circle()
                .fill(Color.yellow)
                .frame(width: 10, height: 10)
                .offset(x: 0, y: -50)
        }
    }
}
The only major issue syntax-wise is that Triangle() isn’t defined – where did it get that?
A quick visit to Hacking with Swift turned up this nice snippet, which I dropped in.
struct Triangle: Shape {
    func path(in rect: CGRect) -> Path {
        var path = Path()
        path.move(to: CGPoint(x: rect.midX, y: rect.minY))
        path.addLine(to: CGPoint(x: rect.minX, y: rect.maxY))
        path.addLine(to: CGPoint(x: rect.maxX, y: rect.maxY))
        path.addLine(to: CGPoint(x: rect.midX, y: rect.minY))
        return path
    }
}
Now we can compile and… ta-da!

Not great, but darn impressive! 🎄🤯