iiSM.ORG

Ride the lightning: The art of Creative Motivation

Gandalf Hudlow @GandalfHudlow
2025/02/11

Over the past few months, I’ve felt the gentle pull of a far-off, irresistible force - an idea that quietly crept into my thoughts. It started as fleeting visions of a JSON packet: a list of available paths on the left, criteria on the right, and a neatly generated story, ready to paste into a chat/email.

 

The vision looked something like this:

Hobbies payload for John

Missing fields:

* no missing fields!

Found fields:

* `person.name` contains value `John`
* `contacts.0.name` matches value `Jill`
* `contacts.*.age` contains value `22`
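The `contacts.*.age` line above implies a segment-wise match where `*` stands in for any single path segment (typically an array index). A minimal, stdlib-only sketch of that idea (the function name is mine, not from the project):

```rust
// Match a concrete dotted path (e.g. "contacts.0.age") against a
// pattern that may use "*" as a single-segment wildcard.
fn path_matches(pattern: &str, path: &str) -> bool {
    let pat: Vec<&str> = pattern.split('.').collect();
    let segs: Vec<&str> = path.split('.').collect();
    // Same number of segments, and each pattern segment is either "*"
    // or an exact match for the corresponding path segment.
    pat.len() == segs.len()
        && pat.iter().zip(&segs).all(|(p, s)| *p == "*" || p == s)
}

fn main() {
    assert!(path_matches("contacts.*.age", "contacts.0.age"));
    assert!(path_matches("person.name", "person.name"));
    assert!(!path_matches("contacts.*.age", "contacts.0.name"));
    assert!(!path_matches("contacts.*", "contacts.0.age")); // length mismatch
    println!("ok");
}
```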

By late 2024, these thoughts became so persistent that I had to act. Over the years, I’ve learned that it’s usually better to lean into these creative impulses. They bring with them a rare kind of motivation - the drive to push through the messy, sometimes discouraging process of learning a new technology or framework.

 

So, I’d caught the bug. Now, what to do about it? The first step was deciding on a tech stack I could stomach. What finally grabbed my interest was reading about advancements in web applications built on WebAssembly, and about how Rust compiles to WebAssembly. That sounded pretty neat!

 

So I settled on a simple proof of concept using JS + WASM to see how it works in practice. My vision started with an available list of JSON paths to each node. serde_json seemed to be the go-to JSON library for Rust and, with some iteration, this function did the trick!

//input:  {"contacts": [{"name": "John", "age": 22}]}
//output: ["contacts.0.name", "contacts.0.age"]
#[wasm_bindgen]
pub fn extract_json_paths(json_str: &str) -> JsValue {
    //todo error handling
    let parsed_json: Value = serde_json::from_str(json_str).unwrap();

    let mut paths: Vec<String> = Vec::new();

    recurse_json(&parsed_json, String::new(), &mut paths);

    to_value(&paths).unwrap()
}

fn recurse_json(value: &Value, current: String, paths: &mut Vec<String>) {
    match value {
        Value::Object(map) => {
            if map.is_empty() {
                paths.push(current.clone());
            }
            for (k, v) in map {
                let new_path = if current.is_empty() {
                    k.clone()
                } else {
                    let dot = ".";
                    let mut new_path = 
                        String::with_capacity(current.len() + 
                            dot.len() + k.len());
                    new_path.push_str(&current);
                    new_path.push_str(dot);
                    new_path.push_str(k);
                    new_path
                };
                recurse_json(v, new_path, paths);
            }
        }

        Value::Array(a) => {
            if a.is_empty() {
                paths.push(current.clone());
            }
            for (index, item) in a.iter().enumerate() {
                let idx = index.to_string();
                //guard the empty-prefix case so a top-level array
                //doesn't produce paths with a leading dot
                let new_path = if current.is_empty() {
                    idx
                } else {
                    [current.as_str(), ".", idx.as_str()].concat()
                };
                recurse_json(item, new_path, paths);
            }
        }

        _ => {
            paths.push(current);
        }
    }
}
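To see the traversal in isolation, here is the same recursion stripped down to a toy `Value` enum instead of `serde_json::Value` (a sketch for illustration only, with all leaf types collapsed into a single `Scalar` variant):

```rust
// Toy stand-in for serde_json::Value, just enough to show the traversal.
enum Value {
    Object(Vec<(String, Value)>),
    Array(Vec<Value>),
    Scalar, // string/number/bool/null all collapse to a leaf here
}

fn recurse_json(value: &Value, current: String, paths: &mut Vec<String>) {
    match value {
        Value::Object(map) => {
            if map.is_empty() {
                paths.push(current.clone());
            }
            for (k, v) in map {
                let new_path = if current.is_empty() {
                    k.clone()
                } else {
                    format!("{current}.{k}")
                };
                recurse_json(v, new_path, paths);
            }
        }
        Value::Array(a) => {
            if a.is_empty() {
                paths.push(current.clone());
            }
            for (index, item) in a.iter().enumerate() {
                recurse_json(item, format!("{current}.{index}"), paths);
            }
        }
        Value::Scalar => paths.push(current),
    }
}

fn main() {
    // {"contacts": [{"name": "John", "age": 22}]}
    let doc = Value::Object(vec![(
        "contacts".into(),
        Value::Array(vec![Value::Object(vec![
            ("name".into(), Value::Scalar),
            ("age".into(), Value::Scalar),
        ])]),
    )]);
    let mut paths = Vec::new();
    recurse_json(&doc, String::new(), &mut paths);
    assert_eq!(paths, vec!["contacts.0.name", "contacts.0.age"]);
    println!("{paths:?}");
}
```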

Great, I have step 1 for my new killer app! Next up, I need to run the WASM output from the Rust code in the browser. The WebAssembly build produces a couple of files: a .wasm file and a .js wrapper that fires up the WebAssembly. Here are a couple of key snippets (wasm_bindgen comes from executing the wrapper script):

<script src="no_modules/jsonstory.js"></script>

//Load the wasm
fetch("no_modules/jsonstory_bg.wasm")
  .then((response) => response.arrayBuffer())
  .then((bytes) => {
    wasm_bindgen.initSync({ module: bytes });
  });

//Use the wasm

fileInput.addEventListener('change', (event) => {
  const file = event.target.files[0];

  const reader = new FileReader();
  reader.onload = (event) => {
    const fileContent = event.target.result;
    const { extract_json_paths } = wasm_bindgen;
    const paths = extract_json_paths(fileContent);
    //...
  };
  reader.readAsText(file);
});

At this point I'm still just working with vanilla JS and DOM manipulation so off we go!

//Some HTML
<div class="layout vertical">
  <h4>Available json paths</h4>
  <div class="layout horizontal">
    <div class="layout vertical" id="jsonBlob1Paths">
    </div>
  </div>
</div>

//Some more JS to append those extracted paths into my HTML above

const paths = extract_json_paths(fileContent);
const pathsElement = e('jsonBlob1Paths');
paths.forEach((path) => {
  const pathDiv = document.createElement("div");
  const pathButton = document.createElement("button");
  pathButton.textContent = "+";
  pathButton.addEventListener("click", () => {
    //todo: add a criteria for this path
  });
  pathDiv.textContent = path;
  pathDiv.appendChild(pathButton);
  pathsElement.appendChild(pathDiv);
});

My gut started sending warning signals that the path I was on might not be sustainable. I brushed it off - after all, how could I really know? It wasn’t until I hit this painful block of code, duplicating existing DOM elements as a makeshift templating system, that I finally paused, stopped typing, and started thinking (code hidden behind button because shame):

 function newBlob() {
  const blobsDiv = e("jsonBlobs");
  const newBlobIndex = blobsDiv.childNodes.length + 1;
  const blob1 = e("jsonBlob1");
  const newBlob = blob1.cloneNode(true);
  const inputId = blobId("jsonBlobInput", newBlobIndex);
  const newInput = newBlob.querySelector("#jsonBlobInput1");
  newInput.id = inputId;
  newInput.value = "";
  newBlob.id = blobId("jsonBlob", newBlobIndex);
  const deleteButton = newBlob.querySelector("#jsonBlobDelete1");
  deleteButton.id = blobId("jsonBlobDelete", newBlobIndex);
  deleteButton.style.display = "block";
  deleteButton.addEventListener("click", () => {
    blobsDiv.removeChild(newBlob);
  });
  const criterias = newBlob.querySelector("#jsonBlobCriterias1");
  criterias.id = blobId("jsonBlobCriterias", newBlobIndex);
  newBlob.querySelector("#jsonBlob1CriteriaValue1").id = 
    criteriaId("jsonBlobCriteriaValue", newBlobIndex, 1);
  newBlob.querySelector("#jsonBlob1CriteriaPath1").id = 
    criteriaId("jsonBlobCriteriaPath", newBlobIndex, 1);
  const addCriteriaButton = newBlob.querySelector("#jsonBlobAddCriteria1");
  addCriteriaButton.addEventListener("click", () => {
    const criteria = criterias.firstElementChild;
    const newCriteria = criteria.cloneNode(true);
    criterias.appendChild(newCriteria);
  });
  blobsDiv.appendChild(newBlob);
}

Ugh. My future lay out in front of me: hacking in more and more DOM manipulations, reinventing some sort of templating system, or, worse yet, standing up and debugging yet another pile of web frameworks that I've already been through a few times. Worse than that, I'd be so bogged down dealing with an ever-increasing mess of hacked-in bindings that I wouldn't even be able to use the output, let alone other people! This last point is what kills it: the only thing more demotivating than making code that nobody will use is making code that nobody can use!

Creating new things is a discovery process that involves a variety of different kinds of failure. Here we see a dead end hit during tech stack discovery. Having supportive leaders who understand this process is really helpful. When leaders don't understand it, discovery just looks like a bunch of failure that should be skipped or avoided so we can all just be successful. I mean, we might as well skip to the end and be successful, right?

Factory vs. Dev Motivation→

 

At that point, I stepped away for a few days. The approach didn’t feel right, and I couldn’t see a compelling way forward. And by “compelling,” I mean something that genuinely excites me. This is a self-motivated project, not a job with deliverables. If it’s not compelling, it’s not going anywhere.

 

Ok, so I'm not interested in firing up yet another JavaScript framework. What to do? During my break from this project I dug around on Reddit to see what other people are doing to create WebAssembly front ends in the browser, and found some chatter about a framework called Leptos. This had more than enough novelty to appeal! I straightaway dove in to see what it was all about. The first thing to note is that Leptos roughly parallels React: components map to components, the view! macro plays the role of JSX, and signals cover the territory of React's state and effect hooks.

 

Here is an example of a Leptos component that shows some hint text above a bouncing yellow, downward-pointing arrow.

#[component]
pub fn Hint(hint_text: String, show_hint: RwSignal<bool>) -> impl IntoView {
    view! {
        <div style=move || if show_hint.get() { "position: relative" } else { "display:none" }>
            <div style="position: absolute; right: 0; top:-20px">
                <Button
                    title="Close".to_string()
                    on:click=move |_| {
                        show_hint.set(false);
                    }
                >
                    {"X"}
                </Button>
            </div>
            <div class="hint-container">
                <div class="hint-callout">{hint_text}</div>
            </div>
            <div class="hint-bouncing-arrow" style="font-size: 48px">
                {"⬇"}
            </div>
        </div>
    }
}

Usage of the component is pretty straightforward, as it can be composed into other components easily. In this case, Leptos's <Show> component is used to conditionally display it:

//Show only if there are available paths
<Show when=move || { json_paths.get().len() > 0 }>
    <Hint show_hint=show_hint 
        hint_text="Press [⇛] below to 
            add field to criteria!".to_string() 
    />
</Show>

As you can imagine, tackling Rust’s borrow rules - which I’m not very experienced with - while also learning the intricacies of a new framework has been quite the learning curve. And with learning comes failure, plenty of it, in this case.

Failure: What do you mean my signal is gone?

The basic structure of the data is a vector of BlobEntry objects, each of which contains an array of Criteria objects and an array of JsonPath objects. Most of the fields are wrapped in Leptos signal objects - basically reactive state containers that UI components can subscribe to, so a component re-renders when a signal changes.
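In plain Rust, with the signals stripped out, the shape is roughly the following. This is a hypothetical sketch: field names beyond those mentioned in the text are my guesses, and in the real app most fields are signal wrappers (e.g. `RwSignal<Vec<Criteria>>`) rather than bare values.

```rust
// Hypothetical sketch of the data model described above.
// In the actual Leptos app, fields like `criterias` live inside
// signals so the UI can react to changes.
struct Criteria {
    path: String,  // e.g. "contacts.*.age"
    value: String, // e.g. "22"
}

struct BlobEntry {
    blob_title: String,
    criterias: Vec<Criteria>,
    json_paths: Vec<String>, // paths extracted from the loaded JSON
}

fn main() {
    let entry = BlobEntry {
        blob_title: "Hobbies payload for John".into(),
        criterias: vec![Criteria {
            path: "person.name".into(),
            value: "John".into(),
        }],
        json_paths: vec!["person.name".into()],
    };
    assert_eq!(entry.criterias.len(), 1);
    println!("{} has {} criteria", entry.blob_title, entry.criterias.len());
}
```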

 

Ah yes, the best laid plans and all that. Things were “working” great! Opening a file populated the list of paths! Pressing the [+] button resulted in new criteria being added! I started sprinkling create_effects here and there to achieve little mini-ideas. However, a random glance at the JS console in the browser alerted me to a lot of red text complaining of panics because some signal was being used after it was disposed! Well, that doesn't sound good, now does it?

 

At this point it was time to start thinking again! One thought was to get clear on the life cycle of the BlobEntry and Criteria objects - were they being created and destroyed when I least expected it? To answer that question I implemented the Drop trait, which is Rust's way to run code (such as logging) when an object is destructed.

impl Drop for Criteria {
    fn drop(&mut self) {
        logging::log!("Dropping Criteria: {}", self.key);
    }
}

By sprinkling the above on my domain objects I started to get a sense of when they were being destroyed. My first experiment to solve the panics was to wrap all my objects in Arc:

//Before:

criterias: ReadSignal<Vec<Criteria>>

//After:

criterias: ReadSignal<Vec<Arc<Criteria>>>

This didn't actually help as much as I thought it would. But I kept it around to eliminate any interesting edge effects where signals inside each domain object might be copied and dropped during various array/object manipulations. It probably works just fine without Arc, but why risk it? Ok, let's be honest: I just didn't want to go back - so let's go forward!
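The intuition behind the Arc experiment can be shown with a small stdlib-only example: cloning a `Vec<Criteria>` duplicates (and later drops) each element, while cloning a `Vec<Arc<Criteria>>` only bumps reference counts, so the inner value - and any signal it owns - is dropped exactly once. This is illustrative only; whether this is what mattered in the real app is, as noted, unproven.

```rust
use std::sync::atomic::{AtomicUsize, Ordering};
use std::sync::Arc;

// Count destructor runs, like the logging::log! probe described above.
static DROPS: AtomicUsize = AtomicUsize::new(0);

#[derive(Clone)]
struct Criteria {
    key: String,
}

impl Drop for Criteria {
    fn drop(&mut self) {
        DROPS.fetch_add(1, Ordering::SeqCst);
    }
}

fn main() {
    // Plain Vec: the clone owns copies, so dropping it runs Drop per element.
    let plain = vec![Criteria { key: "a".into() }];
    let copy = plain.clone();
    drop(copy);
    assert_eq!(DROPS.load(Ordering::SeqCst), 1);

    // Arc-wrapped: cloning shares the allocation; the inner Criteria
    // is not dropped until the last handle goes away.
    let shared = vec![Arc::new(Criteria { key: "b".into() })];
    let copy = shared.clone();
    drop(copy);
    assert_eq!(DROPS.load(Ordering::SeqCst), 1); // still 1: "b" survives
    drop(shared);
    assert_eq!(DROPS.load(Ordering::SeqCst), 2);
    println!("ok");
}
```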

What I'm doing here is avoiding discovery; the right play is to dig in even deeper than I have, because my users are surely going to suffer for my lack of understanding. The brutal truth is that discovery takes time and is painted with the ugly brush in most organizations, because it can't easily be marketed as success™.

In software R&D, Discovery is the R →

I decided to dig deeper into Leptos signals to see if I was overlooking something. While the documentation didn’t provide much clarity, a bit of Googling led me to an interesting code snippet from a discussion:

create_effect(move |_| {
    create_rw_signal(0); // <- bad and I know it
});

Apparently, creating new signals inside a create_effect is a big no-no. Well, now “I know it”! I ran a few experiments to test this further. Instead of reacting to signal changes with create_effect to push a new Criteria object, I modified the code to push the new Criteria object directly in the click handler. The result? No more panics!

Failure: Well of course my signal is gone, I deleted it, why the panic?

At this point, my app handled adding new BlobEntry and Criteria objects with no drama at all. But what if a user wants to remove a BlobEntry or Criteria object? Removing Criteria objects worked like a charm right out of the box! As The Hitchhiker's Guide to the Galaxy famously advises, “Don't panic!” Removing BlobEntry objects, however, was a completely different story: panics all the way down!

 

After some investigative debugging (a mix of commenting out code and testing various scenarios), I narrowed the root cause to removing a BlobEntry that has a non-empty Criteria array. Based on what I’ve read about the changes from Leptos 0.6 to 0.7 (I’m currently on 0.6), this might be addressed in a later version. However, upgrading looked far from trivial, and let’s be honest - stable, working software is far more compelling than upgrading my unfinished software!

 

To tackle this issue, I introduced a delete_date field. Instead of directly removing a BlobEntry, the field marks it as deleted in memory and local storage. Deleted blobs are excluded from being loaded and are visually filtered out in the UI. This workaround did require some annoying rework to the active tab logic, but the tradeoff? No more panics. For now, I’ll take that as a win!
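The soft-delete workaround reduces to something like this sketch (the `delete_date` field is from the text; the function names and timestamp format are mine): instead of `Vec::remove`, stamp the entry and filter stamped entries out of every read path.

```rust
// Sketch of the soft-delete workaround: never remove a BlobEntry,
// just stamp it and filter stamped entries out wherever they are read.
struct BlobEntry {
    blob_title: String,
    delete_date: Option<String>, // e.g. an ISO-8601 date; None = live
}

// Read path: only entries without a delete stamp are visible.
fn visible(entries: &[BlobEntry]) -> Vec<&BlobEntry> {
    entries.iter().filter(|e| e.delete_date.is_none()).collect()
}

// "Delete": mark instead of removing, so no signal is disposed.
fn soft_delete(entry: &mut BlobEntry, when: &str) {
    entry.delete_date = Some(when.to_string());
}

fn main() {
    let mut entries = vec![
        BlobEntry { blob_title: "Hobbies".into(), delete_date: None },
        BlobEntry { blob_title: "Contacts".into(), delete_date: None },
    ];
    soft_delete(&mut entries[1], "2025-02-11");
    let live = visible(&entries);
    assert_eq!(live.len(), 1);
    assert_eq!(live[0].blob_title, "Hobbies");
    println!("ok");
}
```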

The only thing more demotivating than making code that nobody will use is making code that nobody can use! This point is worth repeating: too much Chaos destroys value - people just won't try your stuff out if it is unusable. The solution is to instrument your code to detect chaos, and spend the time to kill it.

Detecting Chaos in Software Systems →

Failure: Wait why doesn't this work as a chrome extension?

One goal I had was to package the code as a Chrome extension. The reason: many organizations have firewalls that filter out new websites, but Chrome extensions - especially zero-permission ones - are allowed without question. Also, since I'd never made one before, it would be a good learning experience. I'd settled on the trunk tool to generate the wasm + JS wrapper that launches my web app. With a bit of research I came up with this basic manifest.json referencing trunk's output:

{
  "action": {
    "default_icon": "images/logo_256.png",
    "default_title": "Open Json Story"
  },
  "background": {
    "service_worker": "background.js"
  },
  "description": "Json field validator that outputs a story to be pasted into Slack",
  "host_permissions": [],
  "icons": {
    "64": "images/logo_64.png",
    "128": "images/logo_128.png",
    "256": "images/logo_256.png"
  },
  "incognito": "split",
  "manifest_version": 3,
  "content_security_policy": {
    "extension_pages": "script-src 'self' 'wasm-unsafe-eval';"
  },
  "name": "Json Story",
  "permissions": [],
  "version": "1.1.6"
}

The very first thing I ran into was a script error complaining that inline scripts aren't allowed. With some trial and error, I finally traced it down to the fact that trunk generates an inline script in the index.html output used to serve up the wasm. With some determined Googling, I stumbled upon a discussion where someone faced the exact same issue. Fortunately, another user had shared a hilarious snippet they added to their build script to extract the inline script into a separate file as part of their Chrome extension publish step:

TRUNK_STAGING_DIR="./chrome_extension"
sed -n '/<script type="module"/,/<\/script>/p' $TRUNK_STAGING_DIR/index.html > \
  $TRUNK_STAGING_DIR/test.js
sed -i '/script/d' $TRUNK_STAGING_DIR/test.js
sed -i '/<script type="module"/,/<\/script>/d' $TRUNK_STAGING_DIR/index.html
sed -i "s@</title>@</title><script type=\"module\" src=\"${TRUNK_PUBLIC_URL}test.js\"></script>@" \
   $TRUNK_STAGING_DIR/index.html

Long story short, with sed all things are possible.

Failure: Wait you want to save AND load from local storage?

As soon as the code for persisting both the BlobEntry and Criteria objects was created, the panics came back! So I started scrubbing for any remaining places where signals were being created inside of a create_effect, which didn't really impact the issue. After some differential diagnostics, the issue narrowed down to loading BlobEntry objects that contained Criteria objects. On a hunch, I implemented loading inside of multiple use_timeout_fn callbacks, thereby staging the construction of the objects across different rounds of the event loop: loading the BlobEntry objects first, and then each entry's Criteria objects.

let UseTimeoutFnReturn { start, .. } = use_timeout_fn(move |_: i32| {
    if local_storage_loaded.get() == 0 {
        load_from_storage(&blob_entries, &set_blob_entries);
        set_local_storage_loaded.set(1);
    } else if local_storage_loaded.get() == 1 {
        for blob in blob_entries.get() {
            load_criterias_from_storage(&blob.criterias, &blob.key, &blob.storage_key);
        }
    }
},
0.0 /*setTimeout delay: 0*/);

create_effect(move |_| {
    if local_storage_loaded.get() == 0 {
        start(0);
    } else if local_storage_loaded.get() == 1 {
        start(0);
    }
});

Since most people are raised to see failure as something negative, teams engaged in discovery thrive when leadership actively supports and celebrates the learning that comes from it. Shifting the focus to learning helps counteract the deeply ingrained, motivation-crushing bias against failure, fostering a more innovative and resilient team culture.

Energize Creative Staff! →

Failure: What do you mean you can't understand my UI?

Once the worst of the chaos in the core functionality was finally under control (at least for me), it was time to find my first test subject! The great thing about immediate family is that they love you enough to engage with your creations - one of the strongest arguments for having them, in my opinion! So, I cornered my son at his computer, giving him the perfect opportunity to demonstrate his boundless affection for me.

 

His first question: “Why don’t these [+] buttons expand the JSON?”

 

A simple answer: “Oh, those are actually for adding a new criteria using that path and value.”

 

But the real answer? The realization that, to most people, [+] in the context of JSON implies expansion, not addition. Clearly, a new icon was needed. And so, the [+] transformed into [⇛]!

 

This also led to the creation of the <Hint> component (mentioned earlier), designed to quickly guide users through the basics.

Learnings

This manifestation of a Creative Attractor resulted in some key learnings.

What kept me going in the end was the thought that someone will use my creation. And at this stage in the process, I definitely count as someone!
