
ChatGPT SwiftUI Christmas Tree 2024

Over the past couple of years (2022, 2023) I’ve given ChatGPT the following prompt and compared the output:

Draw a Christmas tree using SwiftUI

For 2024 I chose the GPT-4o model, and this is what it output:

import SwiftUI

struct ChristmasTreeView: View {
    var body: some View {
        ZStack {
            // Background color
            Color.blue
                .edgesIgnoringSafeArea(.all)
            
            // Tree
            VStack(spacing: -20) {
                Triangle()
                    .fill(Color.green)
                    .frame(width: 150, height: 100)
                Triangle()
                    .fill(Color.green)
                    .frame(width: 180, height: 120)
                Triangle()
                    .fill(Color.green)
                    .frame(width: 210, height: 140)
                
                // Tree trunk
                Rectangle()
                    .fill(Color.brown)
                    .frame(width: 40, height: 60)
            }
            .offset(y: 50)
            
            // Star on top
            StarShape()
                .fill(Color.yellow)
                .frame(width: 40, height: 40)
                .offset(y: -170)
            
            // Decorations (Ornaments)
            Circle()
                .fill(Color.red)
                .frame(width: 15, height: 15)
                .offset(x: -40, y: -60)
            Circle()
                .fill(Color.yellow)
                .frame(width: 15, height: 15)
                .offset(x: 50, y: -30)
            Circle()
                .fill(Color.white)
                .frame(width: 15, height: 15)
                .offset(x: -30, y: 20)
            Circle()
                .fill(Color.blue)
                .frame(width: 15, height: 15)
                .offset(x: 40, y: 60)
        }
    }
}

// Triangle shape for the tree
struct Triangle: Shape {
    func path(in rect: CGRect) -> Path {
        Path { path in
            path.move(to: CGPoint(x: rect.midX, y: rect.minY))
            path.addLine(to: CGPoint(x: rect.minX, y: rect.maxY))
            path.addLine(to: CGPoint(x: rect.maxX, y: rect.maxY))
            path.closeSubpath()
        }
    }
}

// Star shape for the top of the tree
struct StarShape: Shape {
    func path(in rect: CGRect) -> Path {
        let points = [
            CGPoint(x: 0.5, y: 0),
            CGPoint(x: 0.61, y: 0.35),
            CGPoint(x: 1, y: 0.35),
            CGPoint(x: 0.68, y: 0.57),
            CGPoint(x: 0.79, y: 0.91),
            CGPoint(x: 0.5, y: 0.7),
            CGPoint(x: 0.21, y: 0.91),
            CGPoint(x: 0.32, y: 0.57),
            CGPoint(x: 0, y: 0.35),
            CGPoint(x: 0.39, y: 0.35)
        ].map { CGPoint(x: $0.x * rect.width, y: $0.y * rect.height) }
        
        return Path { path in
            path.move(to: points[0])
            for point in points.dropFirst() {
                path.addLine(to: point)
            }
            path.closeSubpath()
        }
    }
}

struct ChristmasTreeView_Previews: PreviewProvider {
    static var previews: some View {
        ChristmasTreeView()
    }
}

Which generated:

This is the best one yet! I didn’t have to add or change any code to make it work, and it added ornaments and even a (floating 😀) star! 🎄


LVM to the rescue

Root problem

I’ve been running Debian 12 for a while on my Linux desktop and recently ran into a big issue: My root volume ran out of space. Never good.

In the past I would’ve taken the path of least discomfort (for me): Clean install, repartition differently, start fresh. Because I’d used LVM when installing, however, I had another option this time:

On small systems (like a desktop), instead of having to estimate at installation time how big a partition might need to be, LVM allows filesystems to be easily resized as needed.

Wikipedia

Goal

I want to expand the root volume by shrinking the home volume a little bit. Let’s see how easy it really is with LVM!

Prerequisites

My environment was an x64 box running Debian 12 with a single 1TB SSD. I didn’t add any physical drives; I only adjusted the space allotted to the LVM volumes.

Live image

I used a live install image to boot Linux without installing anything. After downloading the image I used dd to copy it to a USB flash drive. When the Debian image boots, you’ll need to select “Rescue mode” and follow the menus to get to a shell. When asked about using a root filesystem, tell it not to mount anything.

Figure out paths and numbers

To make this tutorial generic, we’re going to define four variables: two for the volume paths, one for the amount we’re going to grow the smaller volume, and one for the new size we’re going to shrink the bigger volume to.

Paths

There are plenty of ways to figure out the paths to the volumes you’ll be using, but lvdisplay works well.

shrinkPath = path of the volume you want to shrink
growPath = path of the volume you want to grow

Numbers

We’ll also need to define the growth amount of the growing volume and the new size of the shrinking volume.

growthAmount = how much we want the volume at growPath to grow
newShrinkSize = current size of the volume at shrinkPath minus growthAmount

Example

We have a root volume we want to grow by 30GB as well as a home volume (currently 900GB) that we need to shrink.

shrinkPath = /dev/debian-vg/home
growPath = /dev/debian-vg/root
growthAmount = 30G
newShrinkSize = 870G (which is 900-30)

Note: Gigabyte units are assumed in this example, hence the G after the 30 and 870.

Steps

⚠️ Complete these at your own risk! Any time you change filesystems you run the risk of data loss, so backup your stuff!

  1. Boot into a recovery mode without mounting any filesystems.
  2. e2fsck -f {shrinkPath} to check for errors.
    • Example: e2fsck -f /dev/debian-vg/home
  3. resize2fs {shrinkPath} {newShrinkSize} to shrink the filesystem.
    • Example: resize2fs /dev/debian-vg/home 870G
    • Important: Be sure to include the appropriate size unit after the new shrink size, like G for gigabytes. See the resize2fs man page.
  4. lvreduce -L -{growthAmount} {shrinkPath} to shrink the LVM volume.
    • Example: lvreduce -L -30G /dev/debian-vg/home
    • Note: Some tutorials accomplish steps 4 and 5 in one go via lvreduce --resizefs but this did not work for me, so I had to break it out into two steps.
    • Important: Be sure to include the appropriate size unit after the growth amount, like G for gigabytes. See the lvreduce man page.
  5. resize2fs {shrinkPath} to grow the filesystem so it exactly fills the shrunken volume.
    • Example: resize2fs /dev/debian-vg/home
  6. vgdisplay -C should show {growthAmount} free.
  7. e2fsck -f {growPath} to check for errors.
    • Example: e2fsck -f /dev/debian-vg/root
  8. lvextend -l +100%FREE {growPath} to grow the volume by all of the free space remaining in the Volume Group.
    • Example: lvextend -l +100%FREE /dev/debian-vg/root
  9. resize2fs {growPath} to grow the filesystem to the available space.
    • Example: resize2fs /dev/debian-vg/root
  10. vgdisplay -C should show no free space
  11. You should be all done and free to reboot!

Turbo Typer

You do not need an alternate keyboard layout like Colemak or Dvorak to type fast. Nor do you need to learn to use the QWERTY “home row keys.” All it takes is a misspent youth and muscle memory.

My typing teacher in high school wasn’t my biggest fan. I won the class typing competitions without using the “proper” techniques she taught. I could tell it pained her to give me the prizes.

Now if I could only think as fast as I can type…

Give it a go!


Converting an old ray tracer to Wasm

Back around the time the first iPhone was released (feeling old! 😬) I was taking a course where we built a CPU-based ray tracer in C.

If you’re not familiar with the ray tracing technique, check it out here.

What I mostly remember from the course was math, pointers, and segfaults. Oh the segfaults. By the end of the course, however, I had a decent grasp on C and that’s been a valuable skill so many times in my career and hobbies.

How the original project functioned

  1. You wrote an input file that describes various shapes, light sources, and attributes of the environment.
  2. You fed the file into the C program.
  3. You waited a bit (remember, we’re doing everything on the CPU in ~2007).
  4. You got a fancy PPM file. (PPM was a great image format due to its simplicity – we were dealing with enough!)

Wasm motivation

Recently I decided that I wanted to learn more about the inner workings of WebAssembly (Wasm) and figured this would be a great candidate project. It’s fully contained without any external dependencies, *I wrote all the code so it shouldn’t be too mysterious, and if I got it to work there would be a visual payoff.

*Feel free to judge some of the rough spots in the code – it was a long time ago!

Process

The first thing I made sure of was that I could compile the project locally the non-Wasm way. There were no hiccups there – it worked on the first try using the Makefile. ✅

I then started reading this tutorial on converting a C project to Wasm. After installing emscripten on macOS (I used Homebrew), I added a new C source file to the project with a function that looked something like:

#include "emscripten.h"

EMSCRIPTEN_KEEPALIVE
int version() {
  return 42;
}

If I could get this to work I could at least get information from C -> JS for starters. All that it took to make this work was:

  1. Substituting gcc with emcc in the Makefile
  2. Making sure I added the EXPORTED_RUNTIME_METHODS='["cwrap"]' compiler flag
  3. Calling Module.cwrap from JS to use the function
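
To make step 3 concrete, here’s roughly what the JS side looks like. This is a minimal sketch that assumes the version() function above was compiled in and exported; Module.cwrap is the standard emscripten runtime helper named in step 2.

// After the emcc-generated glue script has loaded and the runtime is ready,
// Module.cwrap wraps an exported C function as a plain JS function.
const version = Module.cwrap('version', 'number', []); // (name, return type, argument types)
console.log('version from C:', version()); // logs 42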

That was pretty much it. I’m not going to go super in-depth with this blog post because I think most of it can be figured out from the source.

Next challenges

I had a bit more to go but was surprised at how easy it was to send a value from C to JS. The next items to figure out were:

  1. The ray tracer expected an input file and an output file; how would this work in a browser?
  2. We can pass integers easily, but what about the big array of pixel data when we ultimately generate an image?
  3. Where would our C calls to fprintf and its siblings go when trying to debug to stdout and stderr?
  4. What about main – does it run?

I’ll go ahead and spoil these really quickly in their respective order:

  1. fmemopen saved the day by taking the input string (a char *) and returning a FILE * backed by an in-memory buffer stream. In other words, no massive overhaul was needed, although we aren’t using “real” files anymore. In addition, the project got a slight refactoring to return an array of pixels rather than write out an image file.
  2. From what I understand, JS and the compiled Wasm module share a heap, and both sides just need to know where to find the data via pointers. Here’s an example of sending a pointer to C, and here’s an example of how JS grabs a pointer from C. In the latter, C sets a global (gross, I know, but they used it in their examples as well) and JS calls a function to get that int. It’s then able to initialize a Uint8Array.
  3. They automagically show up in the browser console! This is a really nice feature, and stderr calls are even properly displayed as errors.
  4. Yes! In my case I got rid of it because it was prompting for the CLI input, but it was interesting to see that it automatically ran. There may be a compiler setting to disable this.
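
To make the heap sharing in item 2 a little more concrete, here’s a rough sketch of the JS side. The render_scene function, the image dimensions, and the canvas lookup are hypothetical stand-ins rather than the actual names in the repo, and depending on the emscripten version HEAPU8 may also need to be listed in EXPORTED_RUNTIME_METHODS; the overall pattern of wrapping an exported function with cwrap and viewing the returned pointer through Module.HEAPU8 is the documented emscripten approach.

// Wrap a hypothetical exported C function that takes the scene description
// string and returns a pointer to an RGBA pixel buffer in the Wasm heap.
const renderScene = Module.cwrap('render_scene', 'number', ['string']);

const sceneInput = `...`;             // the big scene-description string (a JS template literal)
const width = 640, height = 480;      // hypothetical output dimensions
const ptr = renderScene(sceneInput);  // returns an offset into the shared heap

// View the pixel buffer through the shared heap, copy it out, and draw it.
const view = new Uint8ClampedArray(Module.HEAPU8.buffer, ptr, width * height * 4);
const imageData = new ImageData(new Uint8ClampedArray(view), width, height);
document.querySelector('canvas').getContext('2d').putImageData(imageData, 0, 0);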

A summary of what it took to convert the ray tracer to Wasm

  1. Tweaked the Makefile to use emcc.
  2. Removed main() because I didn’t need it.
  3. Used fmemopen to substitute a real file with a char array.
  4. Refactored the project to not try to write to a file, but instead return a big array of pixels that ultimately get passed to JS to write to a Canvas.
  5. Expanded the pixel struct to include an alpha channel for the expected RGBA format. Yay for properly using sizeof throughout the code.
  6. Wrote a C source file with everything we needed to interface with JS.
  7. Created an HTML page that calls our compiled JS and gives us access to the exposed functions.
  8. Created a big string using a JS template literal for our input.
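
For what it’s worth, a host page along the lines of items 7 and 8 can be as small as the sketch below. Every name here (raytracer.js, the canvas id) is a placeholder rather than the repo’s actual markup; the general shape (configure a Module object, then load the emcc-generated glue script, which fetches and instantiates the .wasm) is the standard emscripten setup.

<!-- hypothetical host page -->
<canvas id="output" width="640" height="480"></canvas>
<script>
  // Settings the emcc-generated glue code reads when it loads.
  var Module = {
    onRuntimeInitialized() {
      // Safe to call cwrap'd functions from this point on.
      console.log('Wasm runtime ready');
    }
  };
</script>
<script src="raytracer.js"></script> <!-- emitted by emcc next to the .wasm file -->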

That’s mostly it! Check out the GitHub repo here. 🚀


Pinball Arcade Trial Achievement

I really hate to brag, but I wanted to share this life achievement. The Pinball Arcade Trial running on Windows 98 informed me that I got the highest score possible for the trial version.


Geomagnetic storm

Some front yard shots from the geomagnetic storm on May 10th, 2024. I never thought we’d see one of these in South Carolina!

The sky wasn’t as vibrant to the naked eye due to all the ambient light in my neighborhood, but the iPhone 15 Pro’s sensor picked it up great with a three second exposure.


Uploading media from the Jetpack app to WordPress web

It was cool to see this feature rolled out! Kudos to our teammates who worked hard on this. It was a fun thing to hack on in Munich.


Keychron Q6

I’ve been eyeing the Keychron Q6 since the middle of 2023 and finally have one in my possession. Here’s why it’s a winner in my book:

  • Full size – you can’t beat the speed of a numpad!
  • Wired – one more unnecessary battery avoided.
  • QMK firmware – highly flexible and open source.
  • Hot swappable switches – in case I get bored with Gateron Browns.
  • Made out of aluminum – a sturdy 5 lbs!
  • Knobs are cool.
  • South-facing RGB LEDs.
  • Has a handy switch to toggle between Mac and PC modes.

After an hour or so of using it, so far so good! It’s exactly what I hoped it’d be, and a leap in modernization and build quality from the New Model M that I’ve been running. I’ll still use Buckling Springs on my retro machines, of course, but it’s so nice to have LEDs, a knob for volume, dedicated function keys for brightness and Spaces, and the flexibility to do whatever I want with custom firmware. 😎


San Francisco 2024

We had a great time ringing in 2024 with friends in San Francisco!

Some highlights:


ChatGPT SwiftUI Christmas Tree 2023

Revisiting last year’s challenge, let’s see how this year goes with GPT-4.

I used the same prompt from 2022:

Draw a Christmas tree using SwiftUI

Interestingly, this time it included the Triangle type, and it was almost line-for-line the exact one I substituted in last year.

import SwiftUI

struct ContentView: View {
    var body: some View {
        ZStack {
            // Background
            Color.white.edgesIgnoringSafeArea(.all)

            // Christmas Tree
            VStack {
                ForEach((1...5).reversed(), id: \.self) { i in
                    TreeLayer(width: CGFloat(i) * 60)
                }
                Rectangle()
                    .frame(width: 40, height: 50)
                    .foregroundColor(.brown)
            }
        }
    }
}

struct TreeLayer: View {
    var width: CGFloat

    var body: some View {
        ZStack {
            Triangle()
                .fill(Color.green)
                .frame(width: width, height: width * 0.75)
            // Add decorations here
        }
    }
}

struct Triangle: Shape {
    func path(in rect: CGRect) -> Path {
        var path = Path()
        path.move(to: CGPoint(x: rect.midX, y: rect.minY))
        path.addLine(to: CGPoint(x: rect.minX, y: rect.maxY))
        path.addLine(to: CGPoint(x: rect.maxX, y: rect.maxY))
        path.closeSubpath()
        return path
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}

And here’s the output for 2022 and 2023! 🎄😀