WordPress Multi-Multisite: A Case Study

November 27th, 2024

The mission: Provide a dashboard within the WordPress admin area for browsing Google Analytics data for all your blogs.

The catch? You’ve got about 900 live blogs, spread across about 25 WordPress multisite instances. Some instances have just one blog, others have as many as 250. In other words, what you need is to compress a data set that normally takes a very long time to compile into a single user-friendly screen.

The implementation details are entirely up to you, but the final result should look like this Figma comp:

Design courtesy of the incomparable Brian Biddle.

I want to walk you through my approach and some of the interesting challenges I faced coming up with it, as well as the occasional nitty-gritty detail in between. I’ll cover topics like the WordPress REST API, choosing between a JavaScript or PHP approach, rate/time limits in production web environments, security, custom database design — and even a touch of AI. But first, a little orientation.

Let’s define some terms

We’re about to cover a lot of ground, so it’s worth spending a couple of moments reviewing some key terms we’ll be using throughout this post.

What is WordPress multisite?

WordPress Multisite is a feature of WordPress core — no plugins required — whereby you can run multiple blogs (or websites, or stores, or what have you) from a single WordPress installation. All the blogs share the same WordPress core files, wp-content folder, and MySQL database. However, each blog gets its own folder within wp-content/uploads for its uploaded media, and its own set of database tables for its posts, categories, options, etc. Users can be members of some or all blogs within the multisite installation.

What is WordPress multi-multisite?

It’s just a nickname for managing multiple instances of WordPress multisite. It can get messy to have different customers share one multisite instance, so I prefer to break it up so that each customer has their own multisite, but they can have many blogs within their multisite.

So that’s different from a “Network of Networks”?

It’s apparently possible to run multiple instances of WordPress multisite against the same WordPress core installation. I’ve never looked into this, but I recall hearing about it over the years. I’ve heard the term “Network of Networks” and I like it, but that is not the scenario I’m covering in this article.

Why do you keep saying “blogs”? Do people still blog?

You betcha! And people read them, too. You’re reading one right now. Hence, the need for a robust analytics solution. But this article could just as easily be about any sort of WordPress site. I happen to be dealing with blogs, and the word “blog” is a concise way to express “a subsite within a WordPress multisite instance”.

One more thing: In this article, I’ll use the term dashboard site to refer to the site from which I observe the compiled analytics data. I’ll use the term client sites to refer to the 25 multisites I pull data from.

My implementation

My strategy was to write one WordPress plugin that is installed on all 25 client sites, as well as on the dashboard site. The plugin serves two purposes:

  • Expose data at API endpoints of the client sites
  • From the dashboard site, scrape the data from the client sites, cache it in the database, and display it in a dashboard

The WordPress REST API is the Backbone

The WordPress REST API is my favorite part of WordPress. Out of the box, WordPress exposes default WordPress stuff like posts, authors, comments, media files, etc., via the WordPress REST API. You can see an example of this by navigating to /wp-json from any WordPress site, including CSS-Tricks. Here’s the REST API root for the WordPress Developer Resources site:

The root URL for the WordPress REST API exposes structured JSON data, such as this example from the WordPress Developer Resources website.

What’s so great about this? WordPress ships with everything developers need to extend the WordPress REST API and publish custom endpoints. Exposing data via an API endpoint is a fantastic way to share it with other websites that need to consume it, and that’s exactly what I did:


<?php

[...]

function register(WP_REST_Server $server) {
  $endpoints = $this->get();

  foreach ($endpoints as $endpoint_slug => $endpoint) {
    register_rest_route(
      $endpoint['namespace'],
      $endpoint['route'],
      $endpoint['args']
    );
  }
}

function get() {

  $version = 'v1';

  return array(
      
    'empty_db' => array(
      'namespace' => 'LXB_DBA/' . $version,
      'route'     => '/empty_db',
      'args'      => array(
        'methods' => array( 'DELETE' ),
        'callback' => array($this, 'empty_db_cb'),
        'permission_callback' => array( $this, 'is_admin' ),
      ),
    ),

    'get_blogs' => array(
      'namespace' => 'LXB_DBA/' . $version,
      'route'     => '/get_blogs',
      'args'      => array(
        'methods' => array('GET', 'OPTIONS'),
        'callback' => array($this, 'get_blogs_cb'),
        'permission_callback' => array($this, 'is_dba'),
      ),
    ),

    'insert_blogs' => array(
      'namespace' => 'LXB_DBA/' . $version,
      'route'     => '/insert_blogs',
      'args'      => array(
        'methods' => array( 'POST' ),
        'callback' => array($this, 'insert_blogs_cb'),
        'permission_callback' => array( $this, 'is_admin' ),
      ),
    ),

    'get_blogs_from_db' => array(
      'namespace' => 'LXB_DBA/' . $version,
      'route'     => '/get_blogs_from_db',
      'args'      => array(
        'methods' => array( 'GET' ),
        'callback' => array($this, 'get_blogs_from_db_cb'),
        'permission_callback' => array($this, 'is_admin'),
      ),
    ),  

    'get_blog_details' => array(
      'namespace' => 'LXB_DBA/' . $version,
      'route'     => '/get_blog_details',
      'args'      => array(
        'methods' => array( 'GET' ),
        'callback' => array($this, 'get_blog_details_cb'),
        'permission_callback' => array($this, 'is_dba'),
      ),
    ),   

    'update_blogs' => array(
      'namespace' => 'LXB_DBA/' . $version,
      'route'     => '/update_blogs',
      'args'      => array(
        'methods' => array( 'PATCH' ),
        'callback' => array($this, 'update_blogs_cb'),
        'permission_callback' => array($this, 'is_admin'),
      ),
    ),     

  );
}

We don’t need to get into every endpoint’s details, but I want to highlight one thing. First, I provided a function that returns all my endpoints in an array. Next, I wrote a function to loop through the array and register each array member as a WordPress REST API endpoint. Rather than doing both steps in one function, this decoupling allows me to easily retrieve the array of endpoints in other parts of my plugin to do other interesting things with them, such as exposing them to JavaScript. More on that shortly.

Once registered, the custom API endpoints are observable in an ordinary web browser like in the example above, or via purpose-built tools for API work, such as Postman:

JSON output.

PHP vs. JavaScript

I tend to prefer writing applications in PHP whenever possible, as opposed to JavaScript, and executing logic on the server, as nature intended, rather than in the browser. So, what would that look like on this project?

  • On the dashboard site, upon some event, such as the user clicking a “refresh data” button or perhaps a cron job, the server would make an HTTP request to each of the 25 multisite installs.
  • Each multisite install would query all of its blogs and consolidate its analytics data into one response per multisite.

Unfortunately, this strategy falls apart for a couple of reasons:

  • PHP operates synchronously, meaning you wait for one line of code to execute before moving to the next. This means that we’d be waiting for all 25 multisites to respond in series. That’s sub-optimal.
  • My production environment has a max execution limit of 60 seconds, and some of my multisites contain hundreds of blogs. Querying their analytics data takes a second or two per blog.

Damn. I had no choice but to swallow hard and commit to writing the application logic in JavaScript. Not my favorite, but an eerily elegant solution for this case:

  • Due to the asynchronous nature of JavaScript, it pings all 25 Multisites at once.
  • The endpoint on each Multisite returns a list of all the blogs on that Multisite.
  • The JavaScript compiles that list of blogs and (sort of) pings all 900 at once.
  • All 900 blogs take about one-to-two seconds to respond concurrently.
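The two-level fan-out described above can be sketched in a few lines. This is a simplified model, not the plugin's actual code; the two fetchers are stand-ins (my assumptions) for the authenticated Ajax calls covered later in this article:

```javascript
// Stand-in fetchers (assumptions): the real versions hit the plugin's
// LXB_DBA/v1 endpoints on each client site with authenticated requests.
async function fetchBlogsOfInstall(installUrl) {
  // Real version: GET {installUrl}/wp-json/LXB_DBA/v1/get_blogs
  return [
    { id: 1, url: installUrl + '/blog-a' },
    { id: 2, url: installUrl + '/blog-b' },
  ];
}

async function fetchBlogDetails(blog) {
  // Real version: GET the blog's analytics details endpoint
  return { id: blog.id, urls: [blog.url] };
}

async function scrapeAll(installUrls) {
  // Level 1: ping every multisite at once; each returns its list of blogs.
  const blogLists = await Promise.all(installUrls.map(fetchBlogsOfInstall));

  // Level 2: flatten the lists and ping every blog at once.
  const blogs = blogLists.flat();
  return Promise.all(blogs.map(fetchBlogDetails));
}
```

Because every request in each level is issued before any of them resolves, total wall-clock time is roughly the slowest single response per level, not the sum of all responses.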

Holy cow, it just went from this:

( 1 second per Multisite * 25 installs ) + ( 1 second per blog * 900 blogs ) = roughly 925 seconds to scrape all the data.

To this:

1 second for all the Multisites at once + 1 second for all 900 blogs at once = roughly 2 seconds to scrape all the data.

That is, in theory. In practice, two factors enforce a delay:

  1. Browsers cap the number of concurrent HTTP requests they allow, both per domain and overall. The exact limits are poorly documented, but browsers typically allow only about six simultaneous HTTP/1.1 connections per host, with HTTP/2 multiplexing raising that considerably. Based on observing the network panel in Chrome while working on this, the effective ceiling was about 50-100.
  2. Web hosts limit how many requests they can handle within a given period, both per IP address and overall. I was frequently getting a “429 Too Many Requests” response from my production environment, so I introduced a delay of 150 milliseconds between requests. They still operate concurrently; they’re just forced to wait 150ms per blog. Maybe “stagger” is a better word than “wait” in this context:
async function getBlogsDetails(blogs) {
  let promises = [];

  // Iterate and set timeouts to stagger requests by 150ms each
  blogs.forEach((blog, index) => {
    if (typeof blog.url === 'undefined') {
      return;
    }

    let id = blog.id;
    const url = blog.url + '/' + blogDetailsEnpointPath + '?uncache=' + getRandomInt();

    // Create a promise that resolves after a 150ms delay per blog index
    const delayedPromise = new Promise(resolve => {
      setTimeout(async () => {
        try {
          const blogResult = await fetchBlogDetails(url, id);

          if (typeof blogResult.urls === 'undefined') {
            console.error(url, id, blogResult);
          } else if (!blogResult.urls) {
            console.error(blogResult);
          } else if (blogResult.urls.length === 0) {
            console.error(blogResult);
          } else {
            console.log(blogResult);
          }

          resolve(blogResult);
        } catch (error) {
          console.error(`Error fetching details for blog ID ${id}:`, error);
          resolve(null); // Resolve with null to handle errors gracefully
        }
      }, index * 150); // Offset each request by 150ms
    });

    promises.push(delayedPromise);
  });

  // Wait for all requests to complete
  const blogsResults = await Promise.all(promises);

  // Filter out any null results in case of caught errors
  return blogsResults.filter(result => result !== null);
}
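The article doesn't show fetchBlogDetails itself. Here's a plausible minimal shape for it; the signature matches the call above, but the body and error handling are my guesses, not the plugin's actual code:

```javascript
// Hypothetical sketch of fetchBlogDetails (not the plugin's actual code).
// It fetches one blog's details endpoint and returns the parsed JSON.
async function fetchBlogDetails(url, id) {
  const response = await fetch(url, {
    headers: { 'Accept': 'application/json' },
  });

  if (!response.ok) {
    throw new Error(`HTTP ${response.status} fetching details for blog ${id}`);
  }

  return response.json();
}
```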

With these limitations factored in, I found that it takes about 170 seconds to scrape all 900 blogs (the 150ms stagger alone accounts for roughly 135 of those seconds). This is acceptable because I cache the results, meaning the user only has to wait once at the start of each work session.

The result of all this madness, this incredible barrage of Ajax calls, is just plain fun to watch:

PHP and JavaScript: Connecting the dots

I registered my endpoints in PHP and called them in JavaScript. Merging these two worlds is often an annoying and bug-prone part of any project. To make it as easy as possible, I use wp_localize_script():

<?php

[...]

class Enqueue {

  function __construct() {
    add_action( 'admin_enqueue_scripts', array( $this, 'lexblog_network_analytics_script' ), 10 );
    add_action( 'admin_enqueue_scripts', array( $this, 'lexblog_network_analytics_localize' ), 11 );
  }

  function lexblog_network_analytics_script() {
    wp_register_script( 'lexblog_network_analytics_script', LXB_DBA_URL . '/js/lexblog_network_analytics.js', array( 'jquery', 'jquery-ui-autocomplete' ), false, false );
  }

  function lexblog_network_analytics_localize() {
    $a = new LexblogNetworkAnalytics;
    $data = $a->get_localization_data();
    $slug = $a->get_slug();

    wp_localize_script( 'lexblog_network_analytics_script', $slug, $data );
  }

  // etc.              
}

In that script, I’m telling WordPress two things:

  1. Load my JavaScript file.
  2. When you do, take my endpoint URLs, bundle them up as JSON, and inject them into the HTML document as a global variable for my JavaScript to read. This is leveraging the point I noted earlier where I took care to provide a convenient function for defining the endpoint URLs, which other functions can then invoke without fear of causing any side effects.

Here’s how that ended up looking:

The JSON and its associated JavaScript file, where I pass information from PHP to JavaScript using wp_localize_script().
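For illustration, the injected global might look something like this. The exact shape is an assumption on my part, inferred from the object names used in the article's JavaScript, and the URLs are placeholders:

```javascript
// Roughly what wp_localize_script() prints into the admin page, just before
// the plugin's own script tag (values here are placeholders):
var lexblog_network_analytics = {
  endpoint_urls: {
    get_blogs: 'https://client-site.example/wp-json/LXB_DBA/v1/get_blogs',
    insert_blog: 'https://dashboard.example/wp-json/LXB_DBA/v1/insert_blogs',
  },
  installs: {
    // keyed by install ID; credentials are covered in the next section
  },
};

// The plugin's script can then read the URLs without hardcoding them:
const url = lexblog_network_analytics.endpoint_urls.get_blogs;
```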

Auth: Fort Knox or Sandbox?

We need to talk about authentication. To what degree do these endpoints need to be protected by server-side logic? Although exposing analytics data is not nearly as sensitive as, say, user passwords, I’d prefer to keep things reasonably locked up. And since some of these endpoints perform a lot of database queries and Google Analytics API calls, I’d rather not leave them open to anyone who might overload my database or burn through my Google Analytics rate limits.

That’s why I registered an application password on each of the 25 client sites. Using an application password in PHP is quite simple: you authenticate the HTTP requests just like any Basic Authentication scheme.

I’m using JavaScript, so I had to localize them first, as described in the previous section. With that in place, I was able to append these credentials when making an Ajax call:

async function fetchBlogsOfInstall(url, id) {
  let install = lexblog_network_analytics.installs[id];
  let pw = install.pw;
  let user = install.user;

  // Create a Basic Auth token
  let token = btoa(`${user}:${pw}`);
  let auth = {
      'Authorization': `Basic ${token}`
  };

  try {
    let data = await $.ajax({
        url: url,
        method: 'GET',
        dataType: 'json',
        headers: auth
    });

    return data;

  } catch (error) {
    console.error('Request failed:', error);
    return [];
  }
}

That file uses the handy btoa() function to Base64-encode the raw username and password combo into a Basic Authentication token.
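For a concrete example of what that token looks like (the credentials here are obviously fake):

```javascript
// btoa() Base64-encodes the "user:password" string; the resulting token goes
// into the Authorization header, where the server decodes it back.
const token = btoa('user:pass');
console.log(token);                           // "dXNlcjpwYXNz"
console.log(`Authorization: Basic ${token}`); // the header the server sees
```

Note that Base64 is encoding, not encryption; Basic Authentication is only safe over HTTPS.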

The part where we say, “Oh Right, CORS.”

Whenever I have a project where Ajax calls are flying around all over the place, working reasonably well in my local environment, I always have a brief moment of panic when I try it on a real website, only to get errors like this:

CORS console error.

Oh. Right. CORS. Most reasonably secure websites do not allow other websites to make arbitrary Ajax requests. In this project, I absolutely do need the Dashboard Site to make many Ajax calls to the 25 client sites, so I have to tell the client sites to allow CORS:

<?php

  // ...

  function __construct() {
    add_action( 'rest_api_init', array( $this, 'maybe_add_cors_headers' ), 10 );
  }

  function maybe_add_cors_headers() {
    // Only allow CORS for the endpoints that pertain to this plugin.
    if ( $this->is_dba() ) {
      add_filter( 'rest_pre_serve_request', array( $this, 'send_cors_headers' ), 10, 2 );
    }
  }

  function is_dba() {
    $url     = $this->get_current_url();
    $ep_urls = $this->get_endpoint_urls();
    $out     = in_array( $url, $ep_urls );

    return $out;
  }

  function send_cors_headers( $served, $result ) {
    // Only allow CORS from the dashboard site.
    $dashboard_site_url = $this->get_dashboard_site_url();

    header( "Access-Control-Allow-Origin: $dashboard_site_url" );
    header( 'Access-Control-Allow-Headers: Origin, X-Requested-With, Content-Type, Accept, Authorization' );
    header( 'Access-Control-Allow-Methods: GET, OPTIONS' );
    return $served;
  }

  [...]

}

You’ll note that I’m following the principle of least privilege by taking steps to only allow CORS where it’s necessary. (Incidentally, this is also why the get_blogs endpoint registers OPTIONS alongside GET: cross-origin requests that carry an Authorization header trigger a preflight OPTIONS request, which must receive these headers before the browser will send the real request.)

Auth, Part 2: I’ve been known to auth myself

I authenticated an Ajax call from the dashboard site to the client sites. I registered some logic on all the client sites to allow the request to pass CORS. But then, back on the dashboard site, I had to get that response from the browser to the server.

The answer, again, was to make an Ajax call to the WordPress REST API endpoint for storing the data. But since this was an actual database write, not merely a read, it was more important than ever to authenticate. I did this by requiring that the current user be logged into WordPress and possess sufficient privileges. But how would the browser know about this?

In PHP, when registering our endpoints, we provide a permissions callback to make sure the current user is an admin:

<?php

// ...

function get() {
  $version = 'v1';
  return array(

    'update_blogs' => array(
      'namespace' => 'LXB_DBA/' . $version,
      'route'     => '/update_blogs',
      'args'      => array(
        'methods' => array( 'PATCH' ),
        'callback' => array( $this, 'update_blogs_cb' ),
        'permission_callback' => array( $this, 'is_admin' ),
        ),
      ),
      // ...
    );           
  }
                      
function is_admin() {
  $out = current_user_can( 'update_core' );
  return $out;
}

JavaScript can use this — it’s able to identify the current user — because, once again, that data is localized. The current user is represented by their nonce:

async function insertBlog( data ) {
    
  let url = lexblog_network_analytics.endpoint_urls.insert_blog;

  try {
    await $.ajax({
      url: url,
      method: 'POST',
      dataType: 'json',
      data: data,
      headers: {
        'X-WP-Nonce': getNonce()
      }
    });
  } catch (error) {
    console.error('Failed to store blogs:', error);
  }
}

function getNonce() {
  if( typeof wpApiSettings.nonce == 'undefined' ) { return false; }
  return wpApiSettings.nonce;
}

The wpApiSettings.nonce global variable is automatically present in all WordPress admin screens. I didn’t have to localize that. WordPress core did it for me.

Cache is King

Compressing the Google Analytics data from 900 domains into a three-minute loading .gif is decent, but it would be totally unacceptable to have to wait that long multiple times per work session. Therefore, I cache the results of all 25 client sites in the database of the dashboard site.

I’ve written before about using the WordPress Transients API for caching data, and I could have used it on this project. However, something about the tremendous volume of data and the complexity implied within the Figma design made me consider a different approach. I like the saying, “The wider the base, the higher the peak,” and it applies here. Given that the user needs to query and sort the data by date, author, and metadata, I think stashing everything into a single database cell — which is what a transient is — would feel a little claustrophobic. Instead, I dialed up E.F. Codd and used a relational database model via custom tables:

In the Dashboard Site, I created seven custom database tables, including one relational table, to cache the data from the 25 client sites, as shown in the image.

It’s been years since I’ve paged through Larry Ullman’s career-defining (as in, my career) books on database design, but I came into this project with a general idea of what a good architecture would look like. As for the specific details — things like column types — I foresaw a lot of Stack Overflow time in my future. Fortunately, LLMs love MySQL and I was able to scaffold out my requirements using DocBlocks and let Sam Altman fill in the blanks:

<?php 

/**
* Provides the SQL code for creating the Blogs table.  It has columns for:
* - ID: The ID for the blog.  This should just autoincrement and is the primary key.
* - name: The name of the blog.  Required.
* - slug: A machine-friendly version of the blog name.  Required.
* - url:  The url of the blog.  Required.
* - mapped_domain: The vanity domain name of the blog.  Optional.
* - install: The name of the Multisite install where this blog was scraped from.  Required.
* - registered:  The date on which this blog began publishing posts.  Optional.
* - firm_id:  The ID of the firm that publishes this blog.  This will be used as a foreign key to relate to the Firms table.  Optional.
* - practice_area_id:  The ID of the practice area for this blog.  This will be used as a foreign key to relate to the PracticeAreas table.  Optional.
* - amlaw:  Either a 0 or a 1, to indicate if the blog comes from an AmLaw firm.  Required.
* - subscriber_count:  The number of email subscribers for this blog.  Optional.
* - day_view_count:  The number of views for this blog today.  Optional.
* - week_view_count:  The number of views for this blog this week.  Optional.
* - month_view_count:  The number of views for this blog this month.  Optional.
* - year_view_count:  The number of views for this blog this year.  Optional.
* 
* @return string The SQL for generating the blogs table.
*/
function get_blogs_table_sql() {
  $slug = 'blogs';
  $out = "CREATE TABLE {$this->get_prefix()}_$slug (
      id BIGINT NOT NULL AUTO_INCREMENT,
      slug VARCHAR(255) NOT NULL,
      name VARCHAR(255) NOT NULL,
      url VARCHAR(255) NOT NULL UNIQUE, /* adding unique constraint */
      mapped_domain VARCHAR(255) UNIQUE,
      install VARCHAR(255) NOT NULL,
      registered DATE DEFAULT NULL,
      firm_id BIGINT,
      practice_area_id BIGINT,
      amlaw TINYINT NOT NULL,
      subscriber_count BIGINT,
      day_view_count BIGINT,
      week_view_count BIGINT,
      month_view_count BIGINT,
      year_view_count BIGINT,
      PRIMARY KEY (id),
      FOREIGN KEY (firm_id) REFERENCES {$this->get_prefix()}_firms(id),
      FOREIGN KEY (practice_area_id) REFERENCES {$this->get_prefix()}_practice_areas(id)
  ) DEFAULT CHARSET=utf8mb4;";
  return $out;
}

In that file, I quickly wrote a DocBlock for each function, and let the OpenAI playground spit out the SQL. I tested the result and suggested some rigorous type-checking for values that should always be formatted as numbers or dates, but that was the only adjustment I had to make. I think that’s the correct use of AI at this moment: You come in with a strong idea of what the result should be, AI fills in the details, and you debate with it until the details reflect what you mostly already knew.

How it’s going

I’ve implemented most of the user stories now. Certainly enough to release an MVP and begin gathering whatever insights this data might have for us:

Screenshot of the final dashboard which looks similar to the Figma mockups from earlier.
It’s working!

One interesting data point thus far: Although all the blogs are on the topic of legal matters (they are lawyer blogs, after all), blogs that cover topics with a more general appeal seem to drive more traffic. Blogs about the law as it pertains to food, cruise ships, germs, and cannabis, for example. Furthermore, the largest law firms on our network don’t seem to have much of a foothold there. Smaller firms are doing a better job of connecting with a wider audience. I’m positive that other insights will emerge as we work more deeply with this.

Regrets? I’ve had a few.

This project probably would have been a nice opportunity to apply a modern JavaScript framework, or just no framework at all. I like React and I can imagine how cool it would be to have this application be driven by the various changes in state rather than… drumroll… a couple thousand lines of jQuery!

I like jQuery’s ajax() method, and I like the jQueryUI autocomplete component. Also, there’s less of a performance concern here than on a public-facing front-end. Since this screen is in the WordPress admin area, I’m not concerned about Google admonishing me for using an extra library. And I’m just faster with jQuery. Use whatever you want.

I also think it would be interesting to put AWS to work here and see what could be done through Lambda functions. Maybe I could get Lambda to make all 25 plus 900 requests concurrently with no worries about browser limitations. Heck, maybe I could get it to cycle through IP addresses and sidestep the 429 rate limit as well.

And what about cron? Cron could do a lot of work for us here. It could compile the data on each of the 25 client sites ahead of time, meaning that the initial three-minute refresh time goes away. Writing an application around cron initially is fine, I think; coming back six months later to debug it is another matter. Not my favorite. I might revisit this later on, but for now, the cron-free implementation meets the MVP goal.

I have not provided a line-by-line tutorial here, or even a working repo for you to download, and that level of detail was never my intention. I wanted to share high-level strategy decisions that might be of interest to fellow Multi-Multisite people. Have you faced a similar challenge? I’d love to hear about it in the comments!


WordPress Multi-Multisite: A Case Study originally published on CSS-Tricks, which is part of the DigitalOcean family. You should get the newsletter.

Categories: Designing, Others Tags:

Subscriptions Are Killing Creativity in Design: Why the Industry Must Break Free

November 27th, 2024

The graphic and web design world was once a sanctuary of creative freedom, where designers wielded their tools with boundless possibilities, limited only by imagination.

But now, a dark cloud looms over this vibrant industry: the relentless rise of subscription-based services. What was sold to us as a convenient, cost-effective model is now suffocating designers, stifling innovation, and forcing us into a perpetual cycle of dependence.

The Subscription Trap

In the past, owning design software was simple. You bought a product, installed it, and it was yours—forever. Upgrades were optional and came at your own pace.

Today, companies like Adobe, Figma, and countless others have restructured their models to lock designers into expensive monthly subscriptions. On the surface, it seems practical: always have the latest tools and updates. But this isn’t a fair trade; it’s a hostage situation.

The numbers tell the story. Adobe’s Creative Cloud subscription starts at $59.99 per month at the time of this writing for access to essential apps like Photoshop, Illustrator, and InDesign. Over five years, that’s a staggering total of almost $3,600. For freelancers and small studios, it’s a massive financial burden. And if you stop paying? You lose access to everything. All your files, all your tools—gone.

Creativity on a Clock

The subscription model doesn’t just hurt wallets; it punishes creativity. Deadlines and budgets are already stressful, but the looming threat of losing access to essential tools adds another layer of anxiety. Designers are forced into a “pay-to-play” reality where creativity is a service, not a skill. What happens to innovation when the tools of the trade become gated behind a recurring fee?

Even worse, many subscription services now bundle unrelated features into bloated plans, forcing designers to pay for tools they’ll never use. Want just Photoshop? Too bad. You’ll pay for the entire suite, even if you only need one or two applications. It’s the equivalent of being forced to buy a buffet ticket when all you want is a sandwich.

The New Monopoly on Design

Subscriptions also create a dangerous monopoly on creativity. Companies like Adobe, Figma, and Canva dominate the market, making it nearly impossible for independent or smaller competitors to offer alternatives. As designers, our ability to choose is eroding. The tools we use are dictated by industry standards, which are, in turn, dictated by these subscription giants.

When Figma announced its acquisition by Adobe, the collective gasp from designers worldwide wasn’t just about a business deal—it was about the future of affordable, accessible design tools. The writing is on the wall: consolidation and monopolization will leave designers with fewer options and higher costs.

Who Really Benefits?

It’s not the designers. It’s the corporations. Subscription models provide companies with predictable, recurring revenue streams, ensuring their financial security at the expense of their users. They’re no longer incentivized to create groundbreaking new tools; instead, they focus on incremental updates designed to justify the monthly fee. Meanwhile, designers are left paying more for less.

Breaking the Chains

The solution isn’t simple, but it starts with awareness and action. Designers must support alternatives to the subscription model. Open-source software like GIMP, Krita, and Inkscape offers viable, cost-effective options. Companies that still sell perpetual licenses, such as Affinity, deserve our support and advocacy.

Furthermore, we must collectively demand fairer pricing and licensing models. Why can’t companies offer modular subscriptions or rent-to-own options? Designers should be able to pay for the tools they need, not fund a corporation’s endless greed.

Conclusion: A Call to Arms

The graphic and web design community is one of resilience, creativity, and passion. But we cannot afford to let subscription models dictate our futures. It’s time to push back, explore alternatives, and reclaim the tools that allow us to create freely.

Subscriptions aren’t just killing our wallets—they’re killing the very essence of what it means to be a designer. Let’s break the cycle and rediscover the freedom to create.


Follow Up: We Officially Have a CSS Logo!

November 26th, 2024

As a follow up to the search for a new CSS logo, it looks like we have a winner!

Since our last post, the color shifted away from a vibrant pink to a color with a remarkable history among the CSS community: rebeccapurple.

CodePen Embed Fallback

With 400 votes on GitHub, I think the community has chosen well.

Check out Adam’s post on selecting the winner!


Follow Up: We Officially Have a CSS Logo! originally published on CSS-Tricks, which is part of the DigitalOcean family. You should get the newsletter.


Exciting New Tools For Designers, November 2024

November 26th, 2024

Welcome to our latest toolkit collection.

As always, we’ve aimed for a range of apps, utilities, and services to help make life a little easier for designers, and for developers too. And, of course, what would a November collection be without some Thanksgiving images for our readers in the US? Enjoy!

Browser AI Kit

This web app lets you run some of the most popular AI tasks directly in your browser. There are currently three tools available, with potentially more coming.

Punch Back

Have you ever had a really exasperating client? Or are you sick of hearing the same complaints over and over again – make the logo bigger, I want a $10k site for five bucks, etc.? This will help relieve your feelings. No actual clients are harmed in the process of reducing your irritation.

ErrorPulse

ErrorPulse aims to simplify front-end error tracking with helpful features and a minimal dashboard. The free plan covering 5k error credits is an ample trial.


QuickPreview

QuickPreview lets you live test HTML in the browser, which could be really handy for fast prototyping or quick demos. Currently, any styles or scripts must be inline.


Stretch It

This easy-to-use little timer app sits on your macOS menu bar, and you just pull it down to set it. It automatically matches your system color scheme, and there is a range of alert sounds to choose from.


Quill

Quill is a clipboard manager that automatically captures all your copied text and stores it in one place, where it can be easily accessed, organized, and searched.


Static.app

Static.app is a drag-and-drop hosting solution for static websites. The intuitive dashboard makes for easy editing and customization.


Revyme

Revyme is a web builder with a focus on animation. It allows non-frontend developers to create customized animations without code.


Cocolor

Test your color know-how with this game by clicking on the pigment buttons until the image matches the background. It’s not as easy as it sounds.


AutoEmailed

AutoEmailed allows you to automatically email a customer on completion of a successful Stripe payment for digital products.


Note This Down

This handy utility for Notion lets you upload a photo of handwritten notes; it then transcribes the text and stores it in a page of your choosing.


Thanksgiving Icons

This set of seasonal images is bright and joyful. Although it is more general autumnal fruit and veg than turkey and pie, there are a couple of festive pilgrim hats.


Flux AI Lab

Flux AI Lab claims that its AI image generation models are superior to Dall-E and Midjourney. Its suite of tools will create realistic, animated, and illustrated styles, and offers consistency across image sets.


Onlook

Onlook is an open-source visual editor for React apps. It lets you design in your app and instantly writes all changes to code for you. Some technical knowledge is required.


Buddler

Buddler is a set of SEO tools for growth hacking. Its most recent addition is traffic audits of Google Search Console data.

Categories: Designing, Others Tags:

Alt Text: Not Always Needed

November 25th, 2024 No comments
Showing a caption describing an image located directly above it.

Alt text is one of those things in my muscle memory that pops up anytime I’m working with an image element. The attribute almost writes itself.

<img src="image.jpg" alt="">

Or if you use Emmet, that’s autocompleted for you. Don’t forget the alt attribute! Include it even when there’s nothing to describe: an empty string “nulls” the alternative text, telling screen readers the image is decorative. Omit the attribute entirely and many screen readers fall back to announcing the image file name instead. Just be sure it’s truly an empty string, because even a single space gets picked up by some assistive tech, which causes a screen reader to completely skip the image:

<!-- Not empty -->
<img src="image.jpg" alt=" ">
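
To keep the three variants straight, here’s a quick side-by-side. Behavior varies by screen reader, so treat the annotations as the general pattern rather than a guarantee:

```html
<!-- No alt attribute: many screen readers fall back to announcing the file name -->
<img src="image.jpg">

<!-- Null (empty) alt: the image is treated as decorative -->
<img src="image.jpg" alt="">

<!-- Space-only alt: some assistive tech skips the image entirely -->
<img src="image.jpg" alt=" ">
```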

But wait… are there situations where an image doesn’t need alt text? I tend to agree with Eric that the vast majority of images are more than decorative and need to be described. Your images are probably not decorative and ought to be described with alt text.

Probably is doing a lot of lifting there because not all images are equal when it comes to content and context. Emma Cionca and Tanner Kohler have a fresh study on those situations where you probably don’t need alt. It’s a well-written and researched piece and I’m rounding up some nuggets from it.

What Users Need from Alt Text

It’s the same as what anyone else would need from an image: an easy path to accomplish basic tasks. A product image is a good example of that. Providing a visual smooths the path to purchasing because it’s context about what the item looks like and what to expect when you get it. Not providing an image almost adds friction to the experience if you have to stop and ask customer support basic questions about the size and color of that shirt you want.

So, yes. Describe that image in alt! But maybe “describe” isn’t the best wording because the article moves on to make the next point…

Quit Describing What Images Look Like

The article gets into a common trap that I’m all too guilty of, which is describing an image in a way that I find helpful. Or, as the article says, it’s a lot like I’m telling myself, “I’ll describe it in the alt text so screen-reader users can imagine what they aren’t seeing.”

That’s the wrong way of going about it. Getting back to the example of a product image, the article outlines how a screen reader might approach it:

For example, here’s how a screen-reader user might approach a product page:

  1. Jump between the page headers to get a sense of the page structure.
  2. Explore the details of a specific section with the heading label Product Description.
  3. Encounter an image and wonder “What information that I might have missed elsewhere does this image communicate about the product?”

Interesting! Where I might encounter an image and evaluate it based on the text around it, a screen reader is already questioning what content has been missed around it. This passage is one I need to reflect on (emphasis mine):

Most of the time, screen-reader users don’t wonder what images look like. Instead, they want to know their purpose. (Exceptions to this rule might include websites presenting images, such as artwork, purely for visual enjoyment, or users who could previously see and have lost their sight.)

OK, so how in the heck do we know when an image needs describing? It feels so awkward making what’s ultimately a subjective decision. Even so, the article presents three questions to pose to ourselves to determine the best route.

  1. Is the image repetitive? Is the task-related information in the image also found elsewhere on the page?
  2. Is the image referential? Does the page copy directly reference the image?
  3. Is the image efficient? Could alt text help users more efficiently complete a task?

This is the meat of the article, so I’m gonna break those out.

Is the image repetitive?

Repetitive in the sense that the content around it is already doing a bang-up job painting a picture. If the image is already aptly “described” by content, then perhaps it’s possible to get away with nulling the alt attribute.

This is the figure the article uses to make the point (and, yes, I’m alt-ing it):

The caption for this image describes exactly what the image communicates. Therefore, any alt text for the image will be redundant and a waste of time for screen-reader users. In this case, the actual alt text was the same as the caption. Coming across the same information twice in a row feels even more confusing and unnecessary.

The happy path:

<img src="image.jpg" alt="">

But check out this image about an informal/semi-formal table setting, showing how it is not described by the text around it (and, no, I’m not alt-ing it):

If I were to describe this image, I might get carried away describing the diagram and all the points outlined in the legend. If I can read all of that, then a screen reader should, too, right? Not exactly. I really appreciate the slew of examples provided in the article. A sampling:

  1. Bread plate and butter knife, located in the top left corner. 
  2. Dessert fork, placed horizontally at the top center. 
  3. Dessert spoon, placed horizontally at the top center, below the dessert fork.

That’s way less verbose than I would have gone. Talking about how long (or short) alt ought to be is another topic altogether.

Is the image referential?

The second image I dropped in that last section is a good example of a referential image because I directly referenced it in the content preceding it. I nulled the alt attribute because of that. But what I messed up is not making the image recognizable to screen readers. If the alt attribute is null, the screen reader skips it entirely. Yet the screen reader should still know the image is there, even when the surrounding content aptly describes it.

The happy path:

<img src="image.jpg" alt="">

Remember that a screen reader may announce the image’s file name. So maybe use that as an opportunity to both call out the image and briefly describe it. Again, we want the screen reader to announce the image if we make mention of it in the content around it. Simply skipping it may cause more confusion than clarity.
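
Sketching that idea in markup (the file name and surrounding copy here are invented for illustration):

```html
<!-- Copy above the image refers to it directly: "…the table-setting diagram below." -->
<!-- A descriptive file name at least identifies the image if a screen reader
     announces the file name in place of alternative text -->
<img src="informal-table-setting-diagram.png" alt="">
```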

Is the image efficient?

My mind always goes to performance when I see the word efficient pop up in reference to images. But in this context the article means whether or not the image can help visitors efficiently complete a task.

If the image helps complete a task, say purchasing a product, then yes, the image needs alt text. But if the content surrounding it already does the job then we can leave it null (alt="") or skip it (alt=" ") if there’s no mention of it.
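
Putting the three questions together, here’s a rough sketch of the decision in markup. The product copy and file names are hypothetical, and the annotations reflect the article’s guidance rather than a hard rule:

```html
<!-- Repetitive: the caption already communicates everything, so null the alt -->
<figure>
  <img src="sales-chart.png" alt="">
  <figcaption>Sales rose 40% between 2022 and 2024.</figcaption>
</figure>

<!-- Efficient: the image carries task-critical detail the copy doesn't, so describe it -->
<img src="navy-shirt.png" alt="Slim-fit navy shirt with white buttons and a chest pocket">

<!-- Neither referenced nor needed: a single space makes some screen readers skip it -->
<img src="decorative-divider.png" alt=" ">
```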

Wrapping up

I put a little demo together with some testing results from a few different screen readers to see how all of that shakes out.


Alt Text: Not Always Needed originally published on CSS-Tricks, which is part of the DigitalOcean family. You should get the newsletter.

Categories: Designing, Others Tags:

Top 25 Conferences and Events for Web Designers in 2025

November 25th, 2024 No comments

Here’s a curated list of 25 notable conferences and events in 2025 that web designers should consider:

1. Smashing Conf

Hosted by the team behind Smashing Magazine, SmashingConf offers two days of talks and workshops from industry leaders, focusing on practical takeaways for immediate application.

Dates: May 13–14, 2025
Location: San Francisco, California, USA
Website: https://smashingconf.com

2. Awwwards Conference

Celebrating creativity and innovation in web design, the Awwwards Conference attracts top digital designers and developers, featuring inspiring talks, workshops, and award ceremonies.

Dates: February 18–19, 2025
Location: Tokyo, Japan
Website: https://conference.awwwards.com

3. UXDX

Focusing on user experience, product design, and development, UXDX emphasizes end-to-end product delivery and collaboration among designers, developers, and product teams.

Dates: September 24–26, 2025
Location: Dublin, Ireland
Website: https://uxdx.com

4. An Event Apart

This traveling conference series offers intimate learning environments with sessions on CSS, responsive design, and accessibility, catering to those deeply invested in web design.

Dates: Multiple dates in 2025
Locations: Various cities across the USA
Website: https://aneventapart.com

5. CreativePro Week

Catering to graphic designers, web designers, and creative professionals, CreativePro Week offers sessions on branding, typography, and content creation, expanding skill sets beyond web design.

Dates: June 2–6, 2025
Location: Denver, Colorado, USA
Website: https://creativeproweek.com

6. Web Directions Summit

As a leading web design and development conference in the Asia-Pacific region, it features sessions on cutting-edge design techniques, front-end frameworks, and digital product strategies.

Dates: November 6–8, 2025
Location: Sydney, Australia
Website: https://webdirections.org

7. Config by Figma

Figma’s flagship conference celebrates all things design, offering insightful talks, live product demos, and a global community atmosphere.

Dates: August 12–13, 2025
Location: Virtual and In-person (San Francisco, USA)
Website: https://config.figma.com

8. Design Matters

Exploring the intersection of design, art, and technology, this boutique conference is ideal for creatives pushing the boundaries of web design.

Dates: October 1–2, 2025
Location: Copenhagen, Denmark
Website: https://designmatters.io

9. WordCamp Europe

As the flagship event for WordPress users, it offers sessions on themes, plugins, and web performance optimization, benefiting designers working with WordPress.

Dates: June 19–21, 2025
Location: Madrid, Spain
Website: https://europe.wordcamp.org/2025

10. Adobe MAX

Adobe MAX brings together professionals from graphic design, photography, video, and web design, featuring cutting-edge sessions and hands-on labs.

Dates: October 20–22, 2025
Location: Los Angeles, California, USA
Website: https://adobe.com/max

11. The UX Conference

Focusing on user experience and design strategy, it offers talks and workshops tailored to web designers aiming to deepen their understanding of UX principles.

Dates: March 11–12, 2025
Location: London, UK
Website: https://theuxconf.com

12. CSS Day

A highly focused conference for front-end developers and web designers, CSS Day delves deep into advanced CSS techniques, design systems, and browser quirks.

Dates: June 5–6, 2025
Location: Amsterdam, Netherlands
Website: https://cssday.nl

13. Interaction 25

Organized by the Interaction Design Association (IxDA), this global event focuses on interaction design, exploring the evolving role of designers in shaping the digital world.

Dates: February 3–7, 2025
Location: Vancouver, Canada
Website: https://interaction25.ixda.org

14. Frontend United

Bringing together front-end developers and designers, it offers sessions on the latest technologies, tools, and methodologies in web development and design.

Dates: May 15–16, 2025
Location: Ghent, Belgium
Website: https://frontendunited.org

15. OFFF Barcelona

A festival for the creative community, OFFF features workshops, conferences, and performances, inspiring web designers with innovative ideas and trends.

Dates: April 23–25, 2025
Location: Barcelona, Spain
Website: https://offf.barcelona

16. Web Summit

One of the largest tech conferences globally, Web Summit covers a wide range of topics, including web design, development, and digital marketing.

Dates: November 3–6, 2025
Location: Lisbon, Portugal
Website: https://websummit.com

17. FITC Toronto

Focusing on Future, Innovation, Technology, and Creativity, FITC Toronto offers sessions on design, development, and media, catering to web designers and developers.

Dates: April 27–29, 2025
Location: Toronto, Canada
Website: https://fitc.ca

18. Generate Conference

Organized by net magazine, Generate Conference offers practical advice and inspiration for web designers and developers, featuring leading industry speakers.

Dates: September 17–18, 2025
Location: London, UK
Website: https://generateconf.com

19. WebExpo Conference

WebExpo is a prominent event covering frontend and backend development, UX & UI design, AI, data, product research, digital marketing, and business. The 2025 conference offers 70 talks, free workshops, and mentor hours, providing a comprehensive learning experience for web professionals.

Dates: May 28–30, 2025
Location: Prague, Czech Republic
Website: https://webexpo.net

20. The Web Conference (WWW2025)

The Web Conference, formerly known as the International World Wide Web Conference, is an annual event focusing on the future directions of the World Wide Web. It provides a premier forum for discussion about the evolution of the web, standardization of its associated technologies, and their impact on society and culture.

Dates: April 28 – May 2, 2025
Location: To be announced
Website: https://www2025.thewebconf.org

21. UX360 Research Summit 2025

The UX360 Research Summit is a virtual conference focusing on UX and design research methods. Led by over 25 leading UX practitioners, the event covers planning, conducting, analyzing, and implementing UX insights through talks and interactive panel discussions.

Dates: February 19–20, 2025
Location: Virtual
Website: https://ux360summit.com/

22. Web Summit Vancouver

Web Summit Vancouver is set to be one of the world’s biggest tech conferences, bringing together thousands of international entrepreneurs, investors, media outlets, and leaders. This event marks Web Summit’s first foray into North America, continuing its mission to connect the global technology ecosystem.

Dates: May 27–30, 2025
Location: Vancouver, Canada
Website: https://vancouver.websummit.com

23. Adobe Summit – The Digital Experience Conference

Adobe Summit focuses on digital experiences, offering insights into the latest trends and technologies in digital marketing and customer experiences. Attendees can learn from global innovators, connect with peers, and be inspired by industry leaders.

Dates: March 17–20, 2025
Location: Las Vegas, Nevada, USA, and Online
Website: https://summit.adobe.com/na

24. International JavaScript Conference London

This conference is dedicated to JavaScript and its frameworks, offering sessions on the latest developments in JavaScript, web development, and software architecture. It’s ideal for web designers looking to enhance their coding skills and stay updated with industry trends.

Dates: May 12–15, 2025
Location: London, United Kingdom, and Online
Website: https://javascript-conference.com/london

25. World Design Congress 2025

The World Design Congress returns to London, bringing together representatives from various design disciplines, including architecture, communications, transport, and service design. The 2025 theme, “Design for Planet,” focuses on sustainable, circular, and repairable design solutions.

Dates: September 9–10, 2025
Location: London, United Kingdom
Website: https://www.designcouncil.org.uk

These events provide a unique opportunity to stay updated on the latest industry trends, tools, and technologies through workshops, keynote speeches, and hands-on sessions led by experts.

Conferences also foster invaluable networking opportunities, allowing attendees to connect with like-minded peers, potential clients, and industry leaders.

Check out some of these in 2025!

Categories: Designing, Others Tags:

How AI Tools Have Changed UX Writing: Balancing Innovation and the Human Touch

November 25th, 2024 No comments

The rise of AI tools has significantly influenced UX writing, transforming how we create and
transform user experiences. Companies leveraging AI in UX report marked improvements in
efficiency, with AI tools capable of reducing content production time by up to 50%. While AI can
handle routine tasks and improve scalability, the human touch in UX writing remains the key for
crafting easy-to-use, authentic, and emotionally resonant experiences that solve users’ pain
points.

At its core, UX writing involves selecting the right words to guide users through an interface
to achieve seamless interactions and instilling confidence along the way. It is a discipline where
brevity is key, and every word counts. Unlike other forms of writing, UX writing is not about
creativity in a traditional sense; it’s more about precision and clarity. Writers must design for
users who often skim content, and AI tools can assist in maintaining conciseness and clarity in
these quick-read contexts.

AI’s Role in Maintaining Voice and Tone

AI-powered UX writing tools have proven useful in maintaining consistent voice and tone across
large-scale projects or products. Voice refers to the consistent set of characteristics that shape
the personality of the product, while tone adjusts based on the user’s context and emotions.
These elements are key in building trust and creating memorable user experiences that will
be easily associated with your brand. AI excels at upholding these parameters across large
volumes of content, ensuring uniformity and also at reducing human errors in repetitive tasks
such as proofreading and translations.

The Limits of AI: Creativity and Flexibility

Still, we can argue that AI tools lack the flexibility and empathy that come with human input.
While it can process data quickly, AI tools struggle with capturing the subtle problems of specific
user groups and producing truly creative or original content. Moreover, when UX professionals
over-rely on AI, authenticity suffers, and there’s a risk of creating content that
feels impersonal or even artificial. As users interact with your product, they need to feel
genuinely understood, and that is a task best handled by human writers who can intuitively tap
into emotions and context.

Effective UX writing is more than just giving your users clear instructions on how to perform their
tasks; it’s about recognizing key moments in a user’s journey. Although some AI tools can
even assist in identifying these moments by analyzing user behavior patterns, the UX writer is
still the one who makes the final decision on when and how to intervene. There’s a fine line
between helpful guidance and intrusive interaction, and human oversight ensures the experience
feels natural, rather than robotic or even forced.

The Risks of Relying on AI

AI tools can be very helpful when scaling content and ensuring consistency. Still, UX writers and
AI users must consider the risks.

The first major concern we’ll discuss is the ethical issues and biases that often accompany
AI-generated content. There is a lot of biased and stereotyped content out there. AI tools are
trained on existing content and consume it as grounding to create a response. These responses
can alienate users or perpetuate harmful stereotypes on the “garbage in, garbage out” principle.
This is one of the reasons human oversight is essential in identifying and rectifying these biases.

Additionally, AI tools can make mistakes and hallucinate. Those small-print disclaimers in your
favorite tool are there for a reason. Make sure to double-check for correctness and apply
common sense. Blindly accepting AI-generated content without proper review can even lead to
legal issues if the content misrepresents the product or violates guidelines.

Over-reliance on AI may also result in a loss of creativity and a decline in content quality. AI
tools cannot innovate beyond the data they are trained on, leading to repetitive writing that fails
to engage users on a deeper level. By picking speed and perceived efficiency over creativity and
original ideas, you’re risking authoring user experience that is dull and unmemorable. This will
ultimately hinder the product’s ability to connect with your audience.

Another significant risk is over-automation. The strategy of employing AI for automation may
result in losing the human-centered approach that makes UX writing effective. At the end of the
day, you’re not writing for machines; you’re writing for people. AI lacks the intuition needed to
fully understand the complexities of user emotions or motivations, which can result in content
that is too transactional or impersonal, leaving users feeling disconnected from the brand.

In the rush to implement AI solutions, companies may overlook the real
user problems that UX writing aims to address. While AI can optimize word choice and
structure, it lacks a deep understanding of users’ needs and pain points. Even the most perfectly
generated content can miss the mark without this insight. This is where human writers excel.
They focus not only on what is written but also on why it is written and how it will resonate with
users. Without this balance, AI may produce superficially smooth experiences that fail to forge
meaningful connections.

Example from Real Life: Balancing Precision and Empathy

At Syskit, we have a specific challenge when approaching UX writing that surely some of you will
relate to. Since we are developing a product that IT professionals and non-tech-savvy end-users
use, we need to be laser-focused on clarity. We are creating user experiences that must be
intuitive for users with varying degrees of IT skills.

While we leverage the efficiency of AI tools to streamline content consistency and handle
repetitive tasks, we remain committed to maintaining the human touch that makes our products
genuinely resonate with users. Our UX writing strategy is deeply rooted in understanding the
needs of our audiences. How do we do it? Dialogue with customers and constant testing. We
are collaborating with other teams on this, learning the exact phrasing our customers are using,
testing out the journey, gathering feedback, etc.

The ultimate goal is to craft messaging that guides users effortlessly through complex
interfaces and build trust, empathy, and a sense of connection with our brand. This balanced
approach allows us to scale without sacrificing the authenticity and precision that are core to our
values.

Conclusion: AI is just another tool

To sum up, AI has undoubtedly transformed how we approach UX writing by offering improved
efficiency. It should be seen as a tool assisting UX writers rather than replacing human creativity.
AI excels in tasks that require consistency, speed, and accuracy, such as proofreading,
maintaining voice, or generating multiple content variations at scale.

These tools free up time for UX writers to focus on more strategic, creative aspects of
content creation, allowing for deeper user engagement. However, it is crucial to remember that
AI works within the boundaries of the data it’s trained on; it lacks the emotional intelligence and
subtle understanding that human writers bring to the table. Crafting a user experience that feels
natural, empathetic, and aligned with human emotions requires more than algorithms.

The future of UX writing is not about choosing between AI and humans but about leveraging
the strengths of both. With AI handling routine tasks and scaling, writers are empowered to focus
on what they do best: crafting meaningful, user-centered experiences that machines cannot
replicate alone.

Featured image by Clark Young on Unsplash

Categories: Others Tags:

Figma Releases the Pattern Library

November 24th, 2024 No comments

Figma, the industry-leading design platform, has introduced a powerful new resource: the Figma Pattern Library.

This library offers a meticulously curated collection of reusable design patterns aimed at streamlining workflows, fostering collaboration, and enabling designers to produce consistent, high-quality interfaces.

In the evolving landscape of user interface design, consistency and scalability have become crucial for success. The Figma Pattern Library addresses these challenges by providing a centralized toolkit of UI components, making it easier for individuals and teams to maintain design uniformity while preserving creative flexibility.

The Need for a Pattern Library

As digital products grow increasingly complex, maintaining a cohesive design language across applications and platforms has become a significant challenge. Without a standardized approach, teams often face:

  • Fragmentation in Design: Inconsistent styles or mismatched components across screens can confuse users and undermine credibility.
  • Inefficiency in Workflow: Redesigning similar components from scratch wastes time and resources.
  • Difficulty Scaling Designs: As projects grow, it becomes harder to ensure consistency, especially for large teams or distributed collaborators.

The Figma Pattern Library solves these pain points by offering a comprehensive resource for creating and managing reusable design patterns.

What is the Figma Pattern Library?

The Figma Pattern Library is a pre-built collection of UI elements designed with best practices in usability, accessibility, and design systems. These elements include commonly used patterns such as buttons, forms, input fields, toggles, modals, and navigation menus.

But the library isn’t just a repository of design assets—it’s a strategic framework for designers. It serves as both a starting point for new projects and a guide for maintaining consistency in ongoing work.

Key Features and Benefits

A Comprehensive Collection of Patterns

The library includes a wide variety of essential UI components, all crafted with precision. Each component is built to reflect modern design principles, making it easy to implement designs that are both functional and aesthetically pleasing.

For instance, a designer creating a form can quickly pull a pre-designed input field from the library, confident that it meets usability and accessibility standards.

Accessibility-First Design

Accessibility is no longer optional; it’s a core requirement of modern digital design. The Figma Pattern Library is built with accessibility at its foundation, ensuring that components are optimized for all users, including those with disabilities.

Key accessibility features include:

  • Proper color contrast for readability.
  • Support for screen readers.
  • Keyboard navigation compatibility.

By prioritizing accessibility, the library helps designers create inclusive experiences without needing to reinvent the wheel.


Flexibility Through Customization

While the library provides standardized patterns, it also supports customization. Designers can adapt patterns to align with specific brand guidelines, such as adjusting colors, typography, or spacing. This ensures that the patterns maintain consistency while reflecting a project’s unique identity.

For example, a team designing an e-commerce site for a luxury brand can adjust the button styles to reflect the brand’s premium feel, while still leveraging the base structure provided by the library.

Detailed Documentation and Guidelines

Each pattern in the library comes with thorough documentation. This includes:

  • Best practices for implementing the component.
  • Guidelines for when and how to use it.
  • Examples of its application in various contexts.

This documentation reduces the learning curve for new team members and ensures that patterns are applied correctly and consistently.

Seamless Integration Within Figma

One of the most significant advantages of the Figma Pattern Library is its direct integration with the Figma design platform. Designers can access the library without switching tools or disrupting their workflow.

This seamless integration allows for:

  • Quick drag-and-drop functionality to include patterns in projects.
  • Real-time collaboration, where team members can discuss and adapt patterns on the fly.
  • Immediate updates to patterns, ensuring everyone is working with the latest version.

Scalability for Complex Projects

The library is especially valuable for teams working on large-scale projects, such as multi-platform applications or enterprise systems. By providing a standardized set of patterns, it helps ensure that designs remain cohesive across dozens or even hundreds of screens.

Time-Saving and Efficient Workflow

The reusable nature of the patterns significantly reduces the time designers spend on repetitive tasks. Instead of creating similar components from scratch, designers can focus on solving complex problems and crafting creative solutions.

Why the Figma Pattern Library is a Game-Changer

The introduction of the Figma Pattern Library underscores a broader shift in the design industry toward systematization and efficiency. As more organizations adopt design systems to manage their digital products, resources like the Pattern Library become invaluable.

Here’s why the library stands out:

  • For Designers: It provides a foundation that enhances creativity by handling repetitive tasks.
  • For Teams: It fosters alignment and reduces friction in collaboration, especially for distributed or cross-functional teams.
  • For Organizations: It supports brand consistency and accelerates the delivery of high-quality digital products.

Practical Applications

The Figma Pattern Library is versatile and can be applied to a wide range of design scenarios:

  • Startups can use it to quickly build out their design systems and establish a cohesive visual language.
  • Large Enterprises can rely on it to manage consistency across diverse teams and products.
  • Freelance Designers can leverage it to save time on smaller projects while maintaining professional-quality outputs.

Conclusion

The Figma Pattern Library is more than just a collection of UI components—it’s a tool for elevating the design process. By providing reusable, accessible, and customizable patterns, it empowers designers to work more efficiently and collaboratively.

Figma Pattern Library

Categories: Designing, Others Tags:

Jaguar’s New Controversial Logo Unveiled

November 23rd, 2024 No comments

Early this week, Jaguar unveiled a comprehensive rebranding initiative as part of its strategic shift toward an all-electric future.

This transformation includes a new logo, updated branding elements, and a refreshed corporate identity, all designed to align with the company’s commitment to innovation and sustainability.

The redesigned logo features a mix of uppercase and lowercase letters, with the “J,” “G,” and “U” in uppercase, creating a distinctive visual identity. The iconic “leaper” emblem has been modernized and is now set against a backdrop of 16 horizontal lines, referred to as the “Strike Through.”

Additionally, a new monogram combining the letters “J” and “R” has been introduced, symbolizing the brand’s heritage and future aspirations.


Jaguar’s rebranding is guided by the creative philosophy of “Exuberant Modernism,” which emphasizes bold designs and original thinking. This approach aims to recapture the essence of Jaguar’s founding ethos to “Copy Nothing,” making the brand relevant to contemporary audiences.

The rebranding campaign includes a 30-second promotional video featuring diverse models in vibrant attire, set against a futuristic backdrop. Notably, the video does not showcase any vehicles, focusing instead on slogans such as “Copy Nothing” and “Create Exuberant.”

This avant-garde approach has sparked mixed reactions, with some critics questioning the absence of cars in the advertisement and others labeling the campaign as overly “woke.”

This rebranding aligns with Jaguar’s strategic plan to transition to an all-electric lineup by 2026. The company plans to launch three new electric vehicles, starting with a high-performance GT model.

The next phase of Jaguar’s transformation is set to be unveiled at Miami Art Week on December 2, 2024, where the company will debut its “Design Vision Concept,” providing further insights into its future design direction and product offerings.

Jaguar Website

Categories: Designing, Others Tags:

eBay Unveils the Evo Brand System Playbook

November 22nd, 2024 No comments

eBay has introduced the Evo Brand System Playbook, a robust, 280-page guide designed to unify and elevate its global brand identity while fostering flexibility across its markets.

The playbook, which reflects years of research and innovation, provides a detailed framework for eBay’s design and branding strategies, aiming to create a cohesive and user-friendly experience across all platforms.

What is the Evo Brand System Playbook?

The Evo Brand System Playbook is a dynamic resource that consolidates eBay’s brand identity into a single, accessible platform. It includes over 2,700 assets—spanning images, videos, and interactive tools—that guide teams in implementing eBay’s vision.

The playbook outlines principles for typography, iconography, colors, animations, and even custom illustrations, ensuring that eBay’s design philosophy remains consistent globally while adapting to regional nuances.

Core Features of the Evo Playbook

  1. Market Sans Typeface
    The proprietary typeface, Market Sans, is a key feature, delivering a clean and modern aesthetic that is both professional and approachable.
  2. Accessible Color Palette
    Evo introduces a fully accessible color system, ensuring inclusivity for all users, including those with visual impairments. The palette balances vibrancy with functionality, enhancing readability and user engagement.
  3. Custom Iconography and Illustrations
    A rich library of icons and bespoke illustrations helps create a distinct and engaging interface, reflecting eBay’s values of being smart, spirited, and dependable.
  4. Interactivity and Tools
    The Playbook offers tools for real-time exploration, such as the Color Playground, allowing users to experiment with combinations and designs before implementation.
  5. Accessibility-First Design
    Incorporating eBay’s “Include” accessibility annotations plugin for Figma, the Evo system underscores eBay’s commitment to making its platforms inclusive for diverse audiences.

Why Evo Matters

The Evo Brand System is more than a design guide—it’s a commitment to improving the user experience. By aligning its visual and interactive elements, eBay enhances the way customers interact with its platform, making online shopping intuitive and enjoyable.

According to eBay, the Playbook serves as a bridge between creativity and technical precision, ensuring that its teams and partners can deliver a seamless brand story across touchpoints. It also empowers flexibility, enabling regional teams to adapt global guidelines to local markets without losing the essence of the eBay brand.

The Future of Branding at eBay

The Evo Brand System Playbook marks a transformative chapter in eBay’s journey, reaffirming its leadership in e-commerce innovation. With this initiative, eBay aims to create a delightful, inclusive, and inspiring experience for its millions of users worldwide.

The Evo Playbook is now publicly accessible here, inviting designers, developers, and brand enthusiasts to explore eBay’s vision for the future of its brand.

Categories: Designing, Others Tags: