2023 was a big year for generative AI (GenAI). 2024 shows every sign of being another growth year for GenAI.
As a maker, someone who creates through code and design, it's easy to see the confusion, fear and apprehension around a technology that can make and create the very thing of your profession and passion.
I experimented with a secure, local version, feeding it metadata from our UI component libraries across platforms, technologies and frameworks.
It surprised me in the following ways:
- Added ARIA capabilities and attributes based on LLM knowledge and metadata
- Added dynamic unique IDs for list items
- Used imports from our components libraries
- Used the exact components when asked to compose screens and pages, for example a login screen
- Understood the property types, dependencies and inheritances
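To illustrate the "dynamic unique IDs" point, here is a minimal sketch of the pattern the LLM produced for list items. This is a hypothetical reconstruction, not the actual generated output, and the function and prefix names are my own:

```typescript
// Hypothetical sketch: each list item gets a stable unique id,
// which ARIA attributes such as aria-labelledby can then reference.
function withUniqueIds<T>(
  items: T[],
  prefix: string
): Array<{ id: string; value: T }> {
  return items.map((value, index) => ({ id: `${prefix}-${index}`, value }));
}

const nav = withUniqueIds(["Home", "Profile", "Settings"], "nav-item");
// nav[0] → { id: "nav-item-0", value: "Home" }
```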
I also started seeing the limitations.
- When using our accessible, WCAG-compliant Angular components, it added superfluous attributes that were already baked into the components through directives and element/attribute components.
- It similarly added dynamic unique IDs where they were superfluous.
- It lacks server-side or platform-specific rendering capabilities:
- Web platform: it has no rendering engine in the background to render the output and verify it through various testing methods, so it does not know what to add and what is superfluous.
- Native mobile platforms: for Flutter UI components, it is not hooked into an Android Studio environment, an iOS simulator or physical devices to render, test and then provide verified code.
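To make the "superfluous attributes" limitation concrete, here is a small illustrative check. The component, attribute names and helper below are assumptions for the sake of the example, not our real library:

```typescript
// Illustrative only: given the attributes an LLM emitted on a component
// and the set of attributes the component's directive already applies
// internally, list the superfluous duplicates.
function findSuperfluous(
  generated: Record<string, string>,
  bakedIn: Set<string>
): string[] {
  return Object.keys(generated).filter((attr) => bakedIn.has(attr));
}

// e.g. the LLM adds role and aria-pressed to a hypothetical <my-toggle>
// whose directive already sets both:
const extras = findSuperfluous(
  { role: "switch", "aria-pressed": "false", id: "toggle-1" },
  new Set(["role", "aria-pressed"])
);
// extras → ["role", "aria-pressed"]
```

Without a rendering and testing loop, the model has no equivalent of this check, which is why the duplicates slip through.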
On the design side, it also cannot gauge the composition of a UI screen or page, zoom in and out the way a designer would, or evaluate the work against product requirement metrics.
However, it does HUGELY help in busting the agony of the blank canvas or empty code file for designers and developers respectively.
I got an opportunity to experiment and explore during our annual hackathon, which had a primary GenAI theme. My hackathon project vision was aimed at helping developers navigate the assets of the design system with ease, by bringing the code, workflow and community to the developer. For the developer, the vision was a VS Code extension with an experience similar to GitHub Copilot Chat, plus a CLI that complements the extension in all the right ways. For the designer, the solution would come in the form of a Figma plugin that helps with key workflows. I love hacking and envisioning; it has hugely benefitted my career journey and professional fulfillment.
The age of AI is no different. Hackers, experimenters, explorers and builders will reap huge benefits as they keep learning, doing, teaching and repeating the loop.
Here is my understanding and reflection.
- We gotta keep finding the balance in using any new tool or technology.
- We gotta work with it and learn with it.
- We need to hack around and see where we can take it.
Being a Star Wars fan from an early age, the image of R2-D2 teaming up with Anakin and Luke Skywalker on their adventures has always been a strong reference point for me. Christopher Nolan's Interstellar nodded to this as well, with TARS and Cooper.
The combination of human and machine is always way more powerful than either in isolation.
Well, I say this and I wonder what the age of AGI will bring 😂. I will continue staying a student of this world.
Keep hacking and building, my friends. It's an ageless, timeless attitude and disposition.