04 Button Experiments

After weeks of reading and planning, I finally started making things. I built my first computational experiments: button prototypes with different personality parameters. The mind map peer exercise was humbling: people didn't understand my research because I was using too much academic jargon. Andreas's feedback pushed me out of the "thinking trap" and into actual prototyping. The shift from theory to making revealed things I couldn't have learned just by reading.

  • Week-four
    08 ~ 14, Sep, 2025
  • Journal-by
    Choi Yerin
  • Keywords
    • Making
    • Buttons
    • Case-studies
    • Patterns
    • Consultation

Starting with the Fundamentals

Buttons

Airbnb: Get-Started

ChatGPT: Export

CRED: All-in-Jackpot-spin

This week I committed to actually making something rather than just reading and planning. After weeks of theoretical exploration, I decided to start my computational experiments with buttons, the most fundamental interaction element in digital interfaces. Buttons are everywhere: every app, every website, every interface has them. They're often the first thing users interact with when entering a digital space.

If interactions can carry brand identity, buttons seemed like the perfect test case. They are universal enough to be relevant across contexts, but potentially distinctive enough to carry brand personality.

Button Experiment 01:
Sketching Three Button Personalities

I sketched three different button prototypes in p5.js, keeping everything visually identical, the same grey color and the same basic shape, while varying only the interaction behaviors.

First Button Sketch _ p5.js
Press each button to interact with it

It isn't working perfectly yet, but I still gained several insights. I could create noticeably different interactions just by varying properties like easing, maxScale, pressDepth, and releaseSpeed, with each button starting from the same resting state (currentScale = 1).
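The way those parameters might interact can be sketched in plain JavaScript. This is a simplified reconstruction of the idea, not the actual p5.js sketch, and the numbers are illustrative rather than my real presets:

```javascript
// One button "personality" as an update loop. In p5.js, update() would be
// called once per frame inside draw(); here it is framework-free.
function makeButton({ easing, maxScale, pressDepth, releaseSpeed }) {
  let currentScale = 1; // resting state, as in the sketch
  let pressed = false;

  return {
    press() { pressed = true; },
    release() { pressed = false; },
    update() {
      const target = pressed ? 1 - pressDepth : 1;   // pressing shrinks the button
      const speed = pressed ? easing : releaseSpeed; // asymmetric in/out timing
      currentScale += (target - currentScale) * speed; // ease toward target
      currentScale = Math.min(currentScale, maxScale); // clamp any overshoot
      return currentScale;
    },
  };
}

// Two personalities from the same shape: only the numbers differ.
const calm = makeButton({ easing: 0.08, maxScale: 1.05, pressDepth: 0.03, releaseSpeed: 0.05 });
const snappy = makeButton({ easing: 0.4, maxScale: 1.1, pressDepth: 0.12, releaseSpeed: 0.3 });
```

Pressing both and stepping one frame already shows the difference: the snappy button drops toward its pressed depth much faster than the calm one.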

What struck me was how dramatically different these buttons felt to press, even though they looked nearly identical. Just changing timing and easing curves created completely different brand impressions.

Button Experiment 02:
Prototyping Parameter Laboratory

Building on the first experiment, I developed a web-based prototype with a more comprehensive testing setup that allows real-time adjustment of timing, easing, scale, and shadow.

Screen record of prototype
Access the prototype from here

I included preset configurations for different brand personalities and added data-collection capabilities to record user responses, since I plan to run user evaluations once I have developed more prototypes like this.
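A preset table for such a laboratory might look like the sketch below. The brand labels, parameter names, and numbers are placeholders standing in for the real configurations; exposing presets as CSS custom properties is one plausible way to let sliders adjust the same variables live:

```javascript
// Hypothetical personality presets: same button, different parameter sets.
const presets = {
  playful:     { duration: 120, easing: "ease-out-back", scale: 1.15, shadow: 8 },
  premium:     { duration: 350, easing: "ease-in-out",   scale: 1.03, shadow: 2 },
  utilitarian: { duration: 80,  easing: "linear",        scale: 1.0,  shadow: 0 },
};

// Turn a preset into CSS custom properties; a UI could then do
// element.style.setProperty(name, value) for each entry.
function presetToCssVars(name) {
  const p = presets[name];
  return {
    "--press-duration": p.duration + "ms",
    "--press-easing": p.easing,
    "--press-scale": String(p.scale),
    "--press-shadow": p.shadow + "px",
  };
}
```

Keeping the mapping pure (preset in, style map out) would also make it easy to log exactly which configuration a participant saw during evaluation.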

Case Study Progress

Pattern Recognition:
Instagram & Duolingo

Instagram:
long press to react

Instagram:
hold to slide post

While collecting interaction examples this week, I noticed something about Instagram's patterns. They use "hold / long-press to action" interactions extensively throughout their app: users hold to view stories, hold to preview reels, hold for quick actions. This creates a deliberate, mindful engagement pattern that aligns with their recent brand pivot toward intentional sharing rather than mindless scrolling. These are partly usability choices that make actions faster, but the interactions themselves also seem to embody the brand's intimate values.

Duolingo:
set new streak goal

Duolingo:
Chess - 5 in a row

Duolingo:
30-day streak

While collecting examples and using Duolingo myself, I found its patterns too: different, but just as effective, built on extensive haptic feedback and rewarding animations. Every correct answer, every streak milestone, every achievement comes with a physical celebration through vibration.

This isn't just functional feedback; it's their playful, encouraging brand personality expressed through the sense of touch. The physical satisfaction reinforces their brand promise of making learning enjoyable. These patterns aren't accidental; they're deliberately designed to embody brand values.

Realization on Buttons

When I first decided to focus on buttons, I was mostly thinking about the obvious ones, like rectangular shapes with labels, call-to-action buttons, submit buttons.
But as I started collecting more examples, I realized "buttons" are actually much broader than that. Technically, anything you can tap, press, or hold that triggers an action is a button. Instagram's "hold to view story"? That's a button interaction, just not shaped like a traditional button. Duolingo's character animations you tap for sound? Buttons. Even swipeable cards or draggable elements can function as buttons. Buttons are interactive touch targets that respond to input and trigger state changes.

This realization expanded what I need to study and experiment on. I'm not just looking at visual button design; I'm examining button behaviors regardless of their visual form. The interaction pattern matters more than the rectangular shape.
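That broader definition, any touch target whose action depends on how the press unfolds, can be captured in a small sketch. The handler names and the 400 ms hold threshold are my own assumptions, not values from Instagram or Duolingo:

```javascript
// A generic press target: pointer-down starts a timer, and whether the
// release counts as a tap or a hold depends on how long it was held.
function createPressTarget({ onTap, onHold, holdMs = 400 }) {
  let downAt = null; // timestamp of the current press, or null if idle

  return {
    pointerDown(now) { downAt = now; },
    pointerUp(now) {
      if (downAt === null) return; // release without a matching press
      const heldFor = now - downAt;
      downAt = null;
      (heldFor >= holdMs ? onHold : onTap)();
    },
  };
}
```

By this definition, "hold to view story" and a classic submit button are the same structure with different thresholds and handlers, which is exactly why the visual form stops being the interesting variable.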

Consultations

Feedback:
Simplify, Think outside of screen

Key Takeaways:

"Simplify more, expand research, don't use too big words."

My research question has become overly academic and disconnected from real users and real problems. I need to bring it back to clearer, more accessible language.

"Look beyond screen."

He suggested exploring how touching textiles or grass in the morning—when my senses are more sensitive—could inform screen interactions.

"Look beyond screen", this literally opened up a dimension I hadn't considered. I'd been so focused on digital parameters (timing, easing, scale) that I'd forgotten about the rich world of physical sensations that could inform digital feedback design.

How does the resistance of pressing into soft grass compare to the sharp feedback of tapping hard plastic? How could the gradual give of pressing into fabric inform the easing curves of a button press animation? This opened up a completely new experimental direction.
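One way I could start exploring this is by mapping material sensations onto easing functions. The pairing below is my own speculative analogy, not an established mapping: "grass" gives gradually and settles softly, while "plastic" snaps to full depth almost immediately.

```javascript
// Easing curves as material metaphors. t runs from 0 (moment of touch)
// to 1 (end of the press animation); the return value is press progress.
const easings = {
  grass:   (t) => 1 - Math.pow(1 - t, 3),      // ease-out cubic: gradual give
  plastic: (t) => (t < 0.15 ? t / 0.15 : 1),   // near-instant snap, then flat
};
```

Feeding these curves into the button prototypes would let me test whether people actually read "soft" versus "hard" material qualities from timing alone.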

Next Steps

I need to address several things moving forward:

  • Research question clarity:
    Using the HMW (How Might We) formula to make my question less academic and more focused. Right now it's too broad.

  • Target user identification:
    I need to determine who actually benefits from a better understanding of branded interactions: designers, brand managers, developers, or end users themselves?

  • Design-first approach:
    Focus on human perception and experience first, then use technology to explore and express those insights. The technical capabilities should serve design understanding, not drive it.