Looking to add an eye-catching touch to your website? NEAT is here to help!
This innovative tool allows you to create beautiful animated 3D gradients that can be used for hero backgrounds or other design elements on your site. It’s incredibly easy to use and gives you endless creative possibilities.
With NEAT, you can input between 1 and 5 colors of your choice. These colors are then processed through a Perlin noise function, which creates smooth transitions and unique patterns. The result? A visually captivating gradient with a natural, flowing feel that draws attention without overwhelming your design.
What’s more, NEAT is all about customization. You can tweak various settings, like noise intensity, blending modes, and color schemes, to craft the perfect gradient that fits your vision. Whether you’re a seasoned web designer or just starting out, this tool offers something for everyone.
So why stick with plain, static backgrounds when you can make your website stand out with dynamic, artistic gradients?
Try NEAT today and bring your design ideas to life in the most colorful way possible!
Web design is not merely an exercise in aesthetics; it is a blend of art and science where psychology plays a crucial role. Understanding how users think, perceive, and behave online allows designers to create interfaces that feel intuitive, trustworthy, and engaging.
By incorporating psychological principles, web designers can craft experiences that resonate with users, leading to higher engagement, better usability, and stronger brand loyalty. This article explores key psychological principles in web design, illustrated with examples.
The Principle of Visual Hierarchy
Visual hierarchy refers to the arrangement and prioritization of elements on a web page to guide users’ attention. The human brain naturally seeks order, and effective web design leverages this by organizing content in a way that aligns with users’ expectations.
For example, consider the homepage of a news website. Headlines are often displayed in bold, large fonts at the top of the page, while subheadings and body text are smaller and less prominent. This hierarchy directs the user’s focus to the most important content first. The New York Times website exemplifies this principle by combining size, color, and spacing to create a clear flow of information. Large headlines draw attention, images add visual weight, and subtle dividers delineate sections.
Gestalt Principles of Perception
Gestalt psychology explores how people perceive patterns and organize visual information. Several Gestalt principles are particularly relevant to web design, such as proximity, similarity, continuity, and closure.
For instance, the principle of proximity suggests that elements placed close together are perceived as related. Designers use this principle to group navigation links, making it clear that they are part of a cohesive menu. On e-commerce platforms like Amazon, products are grouped into categories with clear visual boundaries, helping users quickly identify relationships between items.
The principle of similarity, on the other hand, dictates that items sharing visual characteristics (e.g., color, shape, or size) are seen as part of the same group. This is why buttons across a website often share the same style—users instinctively recognize them as interactive elements.
Hick’s Law and Decision Making
Hick’s Law states that the time required to make a decision increases with the number of choices available. This principle highlights the importance of simplifying options to avoid overwhelming users.
A prime example of Hick’s Law in action is seen on landing pages for subscription services like Netflix. Instead of presenting a complex array of pricing plans and features upfront, Netflix streamlines the decision-making process by guiding users with a single call-to-action: “Start Your Free Trial.” Additional options and details are revealed incrementally, reducing cognitive load and encouraging users to proceed.
The Role of Color Psychology
Color profoundly influences user behavior and perception, evoking specific emotions and associations. In web design, color can enhance brand identity, guide attention, and affect user decisions.
For example, financial websites often use blue as a primary color due to its association with trust and stability. PayPal employs a blue-dominated color scheme, reinforcing its image as a reliable payment platform. Similarly, e-commerce websites like Amazon use vibrant orange or yellow for “Buy Now” buttons, leveraging these colors’ associations with urgency and positivity.
Fitts’s Law and Interactive Design
Fitts’s Law states that the time to acquire a target is a function of the distance to and size of the target. In web design, this principle emphasizes the importance of designing buttons and interactive elements that are easy to locate and click.
Apple’s website demonstrates this principle effectively. Navigation menus are spacious, and clickable elements such as buttons and icons are large enough to ensure usability across devices. This is especially crucial in mobile design, where smaller screens demand thoughtful placement and sizing of touch targets.
Cognitive Load and Simplicity
Cognitive load refers to the amount of mental effort required to process information. Websites with cluttered designs or excessive text can overwhelm users, leading to frustration and abandonment.
Google’s homepage is an archetype of simplicity. With its iconic search bar as the focal point, the page minimizes distractions and allows users to focus entirely on their primary task: searching. By reducing cognitive load, Google ensures that users can interact with the site effortlessly.
The Power of Reciprocity in Persuasion
The principle of reciprocity, rooted in social psychology, suggests that people are inclined to return favors. In web design, this principle can be applied to foster goodwill and encourage user action.
HubSpot, for instance, offers free resources such as e-books, templates, and tools. This creates a sense of reciprocity, making users more likely to engage with their paid services in the future. By offering value upfront, the website establishes trust and fosters a positive relationship with its audience.
Trust and Social Proof
Trust is a cornerstone of user experience, and social proof is a powerful psychological mechanism for building it. Social proof includes user reviews, testimonials, ratings, and case studies, which reassure potential users by demonstrating that others have had positive experiences.
TripAdvisor exemplifies this principle by prominently displaying reviews and ratings for hotels, restaurants, and attractions. The volume and variety of reviews give users confidence in making informed decisions, reducing uncertainty.
Scarcity and Urgency in E-Commerce
Scarcity creates a sense of urgency, motivating users to act quickly. This psychological principle is frequently used in e-commerce to drive conversions.
Booking.com leverages scarcity by showing messages like “Only 2 rooms left!” or “10 people are looking at this hotel.” These prompts tap into the fear of missing out (FOMO), nudging users to complete their bookings promptly.
Anchoring Effect in Pricing Strategies
The anchoring effect occurs when people rely heavily on the first piece of information they encounter. In web design, this principle is often used in pricing tables.
For instance, SaaS providers like Adobe display premium Creative Cloud plans alongside lower-cost alternatives. By showcasing the higher-priced option first, they lead users to perceive the subsequent, lower-priced plans as more affordable, even if those plans are still relatively expensive.
Conclusion
Incorporating psychological principles into web design is not just about making a website visually appealing; it’s about creating an intuitive and engaging experience that aligns with human behavior.
By understanding concepts like visual hierarchy, Gestalt principles, Hick’s Law, and cognitive load, designers can craft interfaces that are not only functional but also emotionally resonant.
The interplay between psychology and design ensures that websites are more than digital storefronts—they become meaningful spaces that cater to users’ needs and desires.
Every online business must deal with fraud at some point; it is unfortunate but inevitable. Artificial Intelligence (AI) has become a helpful additional tool for detecting, preventing, and responding to fraudulent activity before it becomes a serious problem, a relief given the overwhelming scale of the fraud prevention task. By using AI capabilities, companies gain a considerable advantage over fraudsters.
How AI Changes the Game in Fraud Prevention
Let’s be truthful: we all know AI isn’t perfect, but it has already been a real game-changer in fighting crime, fraud included. It can analyze large amounts of data in real time, surface patterns of possible fraud in seconds, and continuously improve through machine learning (with our help, of course).
More importantly, AI fraud detection tools learn the common hallmarks of fraudulent activity far faster than traditional systems, which can take many times longer to recognize what is going on.
Pattern Recognition
AI does very well in environments where pattern recognition is needed the most. Fraudulent behavior often follows a pattern, such as a strange location for an account login when using a VPN (Virtual Private Network) or several quick purchases over a short period. AI algorithms can detect these anomalies and flag them for review.
Advanced machine learning models are far better at classifying transactions as fraudulent or legitimate. For example, an AI system can analyze historical customer behavior to uncover outliers that a human analyst or a traditional system might miss. The more the models learn about legitimate behavior, the more accurately they pinpoint unusual activity.
Speed and Efficiency
AI-powered fraud prevention tools work in real time, analyzing transactions as they occur. This is essential for mitigating risk, since fraudsters often move quickly to transfer any funds they gain access to before detection.
AI also drastically reduces the load on human fraud analysts (no, AI can’t be left alone, not yet) by filtering out legitimate transactions and flagging only those that need further investigation. This keeps online businesses running smoothly while improving their security.
Adaptive Learning
The ability to adapt over time makes AI and machine learning different. While fraudsters continue evolving tactics to bypass existing security, AI tools are not stationary – they learn and improve as they encounter new data. Such adaptability puts AI-powered systems in an excellent position to match and often outpace evolving fraud techniques, providing a sense of security in the face of changing threats.
Best Practices for AI Implementation in Fraud Prevention
Investing in high-quality data puts an online business on the right path to capitalize on AI for fraud prevention. Since AI systems learn from training data (which makes users the teachers), they need accurate inputs to produce accurate outputs. Precise, comprehensive data collection is therefore of prime importance.
The Know Your Customer (KYC) process is essential to financial institutions, and it is equally important for online businesses to have a Know Your Business (KYB) policy. That means deep knowledge of the unique risk factors relating to your company, insights a business can develop with the help of AI. Making the most of AI-based solutions pays off in the long term, so investing in comprehensive data collection and regular audits is always recommended.
AI-Powered Fraud Prevention: The Advantages for Online Businesses
AI has several unique advantages over traditional fraud prevention methods. Let’s look at the significant benefits that AI can provide to online businesses.
Increased Accuracy
AI’s ability to detect fraud far outstrips traditional rule-based systems. Machine learning models process large volumes of data, learning to flag even the slightest hints of fraud that human analysts could overlook; after all, everyone makes mistakes. By reducing false positives, instances where legitimate transactions are mistakenly flagged as fraudulent, AI helps ensure that legitimate customers are not disturbed.
Reduced Operational Costs
Fraud prevention can be quite resource-intensive, and AI helps by automating processes and reducing the size of the fraud prevention team an organization needs. That cost reduction can be a relief for many businesses. Human intervention will likely always be required, though; as we said before, AI can’t be left entirely on its own.
Scalability
Rapid growth also makes scaling fraud prevention systems challenging. AI-based fraud detection systems, however, scale well: as transaction volume increases, their effectiveness remains uncompromised, which makes them a perfect fit for e-commerce platforms with seasonal sales spikes.
Steps for Online Businesses to Integrate AI Against Fraud
If you are an e-commerce business looking to implement AI in fraud prevention, you should take several steps, which are:
Assess Your Current Fraud Prevention Strategy
First, identify the weaknesses in your current fraud prevention processes: Are you experiencing false positives? Is your team sinking in manual reviews? Such questions will give you an idea of which AI tools will most benefit your business.
Partner With a Reliable AI Vendor
Choosing the right vendor is key: look for an AI solution with extensive fraud prevention experience in your industry and a track record of success. Also ensure the vendor provides flexible APIs, which allow different software systems to communicate, for easy integration with your current systems.
Train and Implement
Most AI-based fraud prevention solutions need some upfront training to tune their algorithms. Work closely with your provider to feed the system your business’s historical data, and test it in a controlled environment before fully deploying it.
Monitor and Optimize
You must continuously monitor your AI system’s performance after full deployment. Fraud detection is not a set-and-forget job; periodically check your system’s accuracy and adjust as necessary to keep up with changing fraud patterns and evolving business needs.
Challenges to Note in Using AI for Fraud Prevention
AI has enormous advantages, but it is not all smooth sailing. Implementation can be hard for smaller businesses with limited resources, though it is very achievable if you commit to it.
Businesses deploying AI systems must also consider privacy regulations such as the GDPR, since compliance is a huge factor where sensitive customer data is concerned. Partner with AI providers that are transparent about how data is used and protected.
AI in Future Fraud Prevention
Improvements in AI and machine learning will shape the future of fraud prevention. As the technology matures, fraud detection tools will become more accurate, responsive, and user-friendly, and more businesses will use AI for proactive threat assessment, predicting fraud before it happens.
AI will also be of prime importance in broader verification processes such as KYB and KYC, helping organizations go beyond fraud detection to build trust with customers and partners. AI is not a short-term trend in fraud prevention but a long-term solution that will keep evolving along with the nature of fraud.
Conclusion
Fraud is a constant plague in the modern world, but AI gives online businesses new and powerful tools to fight it. Using AI for anomaly detection, real-time analysis, and KYB verification, companies can minimize their risks while improving accuracy, efficiency, and the customer experience.
If your online business has not yet modernized its fraud prevention, now is the time to add AI to your strategy. It is an investment worth making: peace of mind, plus an edge over competitors in a fast-moving digital world.
Around 75% of consumers lean into the email channel for both promotional and transactional updates from brands.
But it is even more preferred when the holiday season hits.
94.4% of consumers say they need transactional messages during this busy time, and 79.8% are willing to receive personalized promotions.
The point is: Email is the channel consumers rely on to find the best Black Friday deals and holiday offers. That’s great news for brands and marketers who love the low cost of sending email.
But it also means they are under ever-greater pressure to ensure the right messages reach the inboxes of holiday shoppers, who are busier (and choosier) than ever, at the right time.
With promotions and seasonal deals piling up in email inboxes faster than snow on a December morning, your emails will have to connect with your audience on a 1:1 level.
Email personalization using dynamic content is your chance to nail that.
Dynamic email content offers countless opportunities to reach out to your subscribers with heartfelt, personalized messaging that spreads cheer this holiday season. It can also help you nurture more leads by being at the forefront of their minds when they think about their holiday shopping needs.
So, if you are looking for clever ways to use dynamic email content for holiday email marketing, you’ve come to the right place.
But before we peel back the layers of the different ways you can use dynamic content in holiday emails, let’s go over what dynamic content is.
What Is Dynamic Email Content?
Say you have a prospect named Ken. Ken hasn’t bought anything from you yet, but thanks to the dashboard full of data, you know that Ken has been interested in product X.
Now, here’s my question—
Would you send Ken a yawn-inducing welcome email full of generic deals on unrelated items? Or would you personalize the welcome email with some irresistible deals on the very product he’s interested in?
If you are serious about delivering personalized (read: relevant) shopping experiences that holiday shoppers want, you will find ways to fine-tune your holiday email campaigns based on who opens them, right?
Dynamic email content is a personalized email element that changes based on subscriber behavior, interests, or order history.
This could mean that the highlighted products in an upcoming sale change based on the recipient’s interest, visual content that adapts images based on the customer’s gender preferences, or email design that adjusts based on the local weather, customs, or cultural contexts.
While basic email personalization, like including the recipient’s name in the subject line, plays a part in holiday email personalization, dynamic email content takes things further. It allows marketers to draw on subscribers’ data and behavior and send one campaign, one time, that is optimized and targeted to every individual.
This means the entire email doesn’t have to be unique for every single customer. Only certain elements change, tailored to each subscriber’s preferences.
Top 6 Ways to Energize Your Holiday Email Campaigns With Dynamic Content
Some ways to use dynamic content and make your holiday email flows more effective are:
Countdown Timers
Countdown timers in holiday emails tell your subscribers that the holiday deal is about to disappear or that prices are going to soar after the timer stops.
Each time a customer opens the email, they get a real-time reminder of the remaining days, hours, minutes, and seconds. Not to mention the adrenaline rush and the FOMO of losing out on your brand’s special offer. It pushes them to grab it before it’s too late.
And it’s not just urgency or scarcity, either. Including countdown timers in your holiday email campaigns helps customers plan better during the busy holiday season.
However, they are also a matter of trust. So, if the email says the offer ends today, it should end today.
Recommended or Popular Gift Ideas
The more personalized your gift recommendations, the more your customers will spend this holiday season.
One of the simplest and most effective ways to do this is to send dynamic emails that suggest tailored or popular gifts.
They are game-changers, especially during the holiday season. Your subscribers are looking for gift ideas. Show them personalized gift ideas based on their past purchases and browsing behavior. Better yet, curate a gift guide with popular items, best sellers, or bundle products frequently bought together.
By using product feeds and audience segmentation, you can help shoppers find the perfect gift for loved ones.
Loyalty Programs Update
Loyal customers are worth celebrating. We all count on them to keep sales rolling in, don’t we?
But to keep them choosing your brand, especially during the holiday season, you must keep them delighted with rewards. Otherwise, they might just wander off to competitors offering attractive holiday deals.
And if dynamic loyalty programs don’t do that for you, what will?
Holiday emails with dynamic loyalty points show subscribers how many points they’ve earned after their past purchases. They remind them of the rewards they can unlock using these points—exclusive discounts, festive offers, free gifts, or limited-time perks.
Each point reflects their engagement with your brand, like purchases, referrals, or other meaningful interactions. And wouldn’t you be thrilled at the prospect of redeeming points right when you’re ready to shop? Likewise, for your subscribers.
The excitement of redeeming points nudges them closer to checking out and makes them feel valued. They, after all, have stuck with you through thick and thin.
Just remember, this holiday email marketing strategy only works if your campaigns have been consistently rewarding loyal customers all year round.
Shipping and Order Tracking
You have worked so hard to create and market an amazing product. I am sure you don’t want a poor post-purchase experience to be the only thing your customers remember about your brand.
So, remember this–
Holiday shoppers shouldn’t have to refresh the tracking page repeatedly and wonder if their holiday gift will arrive on time.
With dynamic shipment and order tracking emails, real-time tracking information is embedded in the email itself, letting customers check their order status without clicking through to a separate page. Instead of the static delivery estimate we normally see in standard emails, these emails feature a live tracking graphic that keeps updating in real time.
It’s a straightforward yet effective way to reassure customers that their holiday gifts are en route, giving them one less thing to worry about.
Product Stock Updates
Nothing’s worse for holiday shoppers (or your brand) than a customer who has their heart set on a gift only to find it sold out.
Try dynamic emails with real-time stock updates to save yourself from embarrassment. These dynamic elements are a must for your holiday email workflows as they tell customers when to rush to place an order. The undertones of urgency drive purchases before the must-haves disappear.
Take it a notch further by automatically removing low-stock items from emails. This spares shoppers the disappointment of seeing unavailable products, keeping their holiday spirits intact.
Geolocation
The logic here is sound: know your subscribers’ geographic locations and send them targeted, location-based content that resonates with their specific region.
That’s much more meaningful than saying, “Get cozy with our hot cocoa gift set!” to someone planning a beach barbecue.
By asking for a subscriber’s zip code when they sign up, you have the chance to deliver location-based offers, time-zone-specific sends, and even maps to the nearest store.
Another holiday email marketing strategy that makes sense for global brands is tailoring email visuals and messaging using geolocation.
For instance, traditional winter themes could be used for subscribers in the Northern Hemisphere, and sunny, beachy themes for those in the Southern Hemisphere.
Such emails make for an engaging and relevant holiday shopping experience because they tailor content and design to match the local climate, making it uniquely suited to individual subscribers.
Wrapping Up
It should be pretty clear by now that dynamic email elements are a powerful feature for your holiday email marketing campaigns. They make your email design stand out from the competition and create a sense of urgency in your subscribers.
Sure, crafting dynamic email campaigns for the holiday season takes time, creativity, and planning. But it is worth every bit of effort.
Just be sure to thoroughly test your emails to catch any potential rendering issues and ensure they reach your audience’s inbox without a hitch.
Minimalism continues to be a dominant trend among well-designed websites, but it is clear that minimal does not mean visually dull. Minimalist design can incorporate color, animation, and even decorative fonts, as long as restraint is exercised.
On the other hand, a strong site architecture with a clear and robust structure can convey a sense of simplicity, even if the visual design is more elaborate. When content is organized, users will feel more comfortable navigating the site. Enjoy!
Vibrant, characterful illustrations help bring to life this collection of oral testimonies from over 200 elders, including activists and community builders, who witnessed and helped shape change in American society.
This portfolio site for Hugmun creative studio makes clever use of a central slideshow to create a structure that can present plenty of content for an individual project while keeping others within easy reach.
Emergence Magazine is a magazine and creative studio that explores the connections between ecology, culture, and spirituality through storytelling and art across various mediums. Interviews and essays sit alongside films and immersive web experiences on a calm, unobtrusive backdrop.
This interactive experience from the RSPCA (Royal Society for the Prevention of Cruelty to Animals) explores the impact that technology, climate change, political decisions, and even our dietary choices will have on the future. The illustration style is friendly without being too cutesy, and the gamified format allows information to be presented in digestible chunks.
The minimalist design of Duten’s website reflects the minimalist style of its product range. Considered animation effects add a layer of sophistication.
Lifeworld is an artwork by Olafur Eliasson for WeTransfer as guest curator of its artist platform. The use of black and white and the irregular grid layout creates drama and an interesting rhythm.
This site for Gelato La Boca is bright with a fun, almost comic-book feel. The color scheme is actually quite minimal, but because of how the colors are used, it seems like more.
This is an appealingly minimalist site. Several design elements, such as the product details and customization boxes, and the display type, reflect the style of the products sold.
Skillbard has recently rebranded, and this website is part of that new brand identity. It has a sense of playfulness about it, with wiggly and animated type and a color scheme that changes randomly.
HUWD is a new platform for challenging how technology is developed and deployed with the aim of adopting a more thoughtful approach. The logotype has a deliberate liquidness, and the occasional color gradients give an ethereal feel.
The clever landing page concept of a contact sheet with magnifier piques the user’s interest before leading to a well-organized, easy to navigate agency portfolio.
This portfolio site for creative agency Otherlife focuses almost entirely on case studies. These are well presented with plenty of images and concise supporting text. The agency’s own branding is minimal and avoids intruding.
Docky is an Airbnb-style platform connecting boat owners with berths. This supporting website splits in two to cover those offering berths and those renting them separately. Animation and simple illustration add depth.
Watchmaker Omega is promoting its support of the ClearSpace project to remove manmade debris from space. Animation and illustration combine to create an impactful and informative experience.
The mission: Provide a dashboard within the WordPress admin area for browsing Google Analytics data for all your blogs.
The catch? You’ve got about 900 live blogs, spread across about 25 WordPress multisite instances. Some instances have just one blog, others have as many as 250. In other words, what you need is to compress a data set that normally takes a very long time to compile into a single user-friendly screen.
The implementation details are entirely up to you, but the final result should look like this Figma comp:
I want to walk you through my approach and some of the interesting challenges I faced coming up with it, as well as the occasional nitty-gritty detail in between. I’ll cover topics like the WordPress REST API, choosing between a JavaScript or PHP approach, rate/time limits in production web environments, security, custom database design — and even a touch of AI. But first, a little orientation.
Let’s define some terms
We’re about to cover a lot of ground, so it’s worth spending a couple of moments reviewing some key terms we’ll be using throughout this post.
What is WordPress multisite?
WordPress Multisite is a feature of WordPress core — no plugins required — whereby you can run multiple blogs (or websites, or stores, or what have you) from a single WordPress installation. All the blogs share the same WordPress core files, wp-content folder, and MySQL database. However, each blog gets its own folder within wp-content/uploads for its uploaded media, and its own set of database tables for its posts, categories, options, etc. Users can be members of some or all blogs within the multisite installation.
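For the curious, enabling Multisite comes down to a handful of constants in wp-config.php. Here is a minimal sketch; the domain is a placeholder, and the exact values are generated for you by the Network Setup screen:

```php
/* wp-config.php — Multisite configuration (per the WordPress documentation) */
define( 'WP_ALLOW_MULTISITE', true ); // unlocks Tools → Network Setup

// Added after running Network Setup:
define( 'MULTISITE', true );
define( 'SUBDOMAIN_INSTALL', false );  // false = subdirectory blogs, e.g. example.com/blog-a
define( 'DOMAIN_CURRENT_SITE', 'example.com' );
define( 'PATH_CURRENT_SITE', '/' );
define( 'SITE_ID_CURRENT_SITE', 1 );
define( 'BLOG_ID_CURRENT_SITE', 1 );
```

With those in place (plus the rewrite rules Network Setup gives you), one installation serves every blog in the network.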
What is WordPress multi-multisite?
It’s just a nickname for managing multiple instances of WordPress multisite. It can get messy to have different customers share one multisite instance, so I prefer to break it up so that each customer has their own multisite, but they can have many blogs within their multisite.
So that’s different from a “Network of Networks”?
It’s apparently possible to run multiple instances of WordPress multisite against the same WordPress core installation. I’ve never looked into this, but I recall hearing about it over the years. I’ve heard the term “Network of Networks” and I like it, but that is not the scenario I’m covering in this article.
Why do you keep saying “blogs”? Do people still blog?
You betcha! And people read them, too. You’re reading one right now. Hence, the need for a robust analytics solution. But this article could just as easily be about any sort of WordPress site. I happen to be dealing with blogs, and the word “blog” is a concise way to express “a subsite within a WordPress multisite instance”.
One more thing: In this article, I’ll use the term dashboard site to refer to the site from which I observe the compiled analytics data. I’ll use the term client sites to refer to the 25 multisites I pull data from.
My implementation
My strategy was to write one WordPress plugin that is installed on all 25 client sites, as well as on the dashboard site. The plugin serves two purposes:
Expose data at API endpoints of the client sites
Scrape the data from the client sites, cache it in the dashboard site's database, and display it in a dashboard.
The WordPress REST API is the Backbone
The WordPress REST API is my favorite part of WordPress. Out of the box, WordPress exposes default WordPress stuff like posts, authors, comments, media files, etc., via the WordPress REST API. You can see an example of this by navigating to /wp-json from any WordPress site, including CSS-Tricks. Here’s the REST API root for the WordPress Developer Resources site:
What’s so great about this? WordPress ships with everything developers need to extend the WordPress REST API and publish custom endpoints. Exposing data via an API endpoint is a fantastic way to share it with other websites that need to consume it, and that’s exactly what I did:
We don’t need to get into every endpoint’s details, but I want to highlight one thing. First, I provided a function that returns all my endpoints in an array. Next, I wrote a function to loop through the array and register each array member as a WordPress REST API endpoint. Rather than doing both steps in one function, this decoupling allows me to easily retrieve the array of endpoints in other parts of my plugin to do other interesting things with them, such as exposing them to JavaScript. More on that shortly.
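The decoupling idea is worth a quick sketch. The plugin does this in PHP against the WordPress REST API, but the pattern is language-agnostic, so here it is in miniature in JavaScript; all names here are hypothetical stand-ins, not the plugin's actual functions:

```javascript
// Step 1: one side-effect-free function owns the list of endpoints.
function getEndpoints() {
  return [
    { route: '/blog-details', callback: 'handleBlogDetails' },
    { route: '/blogs-of-install', callback: 'handleBlogsOfInstall' },
  ];
}

// Step 2: a separate function loops over the list and registers each one.
// `register` stands in for the framework's registration call.
function registerEndpoints(register) {
  for (const endpoint of getEndpoints()) {
    register(endpoint.route, endpoint.callback);
  }
}
```

Because `getEndpoints()` only returns data and never registers anything itself, any other part of the plugin can call it, for example, to expose the same route list to JavaScript.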
Once registered, the custom API endpoints are observable in an ordinary web browser like in the example above, or via purpose-built tools for API work, such as Postman:
PHP vs. JavaScript
I tend to prefer writing applications in PHP whenever possible, as opposed to JavaScript, and executing logic on the server, as nature intended, rather than in the browser. So, what would that look like on this project?
On the dashboard site, upon some event, such as the user clicking a “refresh data” button or perhaps a cron job, the server would make an HTTP request to each of the 25 multisite installs.
Each multisite install would query all of its blogs and consolidate its analytics data into one response per multisite.
Unfortunately, this strategy falls apart for a couple of reasons:
PHP operates synchronously, meaning you wait for one line of code to execute before moving to the next. This means that we’d be waiting for all 25 multisites to respond in series. That’s sub-optimal.
My production environment has a max execution limit of 60 seconds, and some of my multisites contain hundreds of blogs. Querying their analytics data takes a second or two per blog.
Damn. I had no choice but to swallow hard and commit to writing the application logic in JavaScript. Not my favorite, but an eerily elegant solution for this case:
Due to the asynchronous nature of JavaScript, it pings all 25 Multisites at once.
The endpoint on each Multisite returns a list of all the blogs on that Multisite.
The JavaScript compiles that list of blogs and (sort of) pings all 900 at once.
All 900 blogs take about one-to-two seconds to respond concurrently.
Holy cow, it just went from this:
( 1 second per Multisite * 25 installs ) + ( 1 second per blog * 900 blogs ) = roughly 925 seconds to scrape all the data.
To this:
1 second for all the Multisites at once + 1 second for all 900 blogs at once = roughly 2 seconds to scrape all the data.
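That back-of-the-napkin math can be sketched in code. Here's a minimal, hypothetical simulation contrasting sequential awaits (the PHP-style flow) with a `Promise.all()` fan-out; `fetchInstall()` is a stand-in for a real HTTP request:

```javascript
// Stand-in for an HTTP request that takes ~30ms to respond.
function fetchInstall(id) {
  return new Promise(resolve => setTimeout(() => resolve(`install-${id}`), 30));
}

// Sequential: each await blocks before the next request starts,
// so total time grows linearly with the number of installs.
async function scrapeSequentially(ids) {
  const results = [];
  for (const id of ids) {
    results.push(await fetchInstall(id));
  }
  return results;
}

// Concurrent: all requests start at once, so total time is roughly
// one round trip regardless of how many installs there are.
function scrapeConcurrently(ids) {
  return Promise.all(ids.map(id => fetchInstall(id)));
}
```

With 25 installs at one second each, the sequential version takes about 25 seconds, while the concurrent version takes about one.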
That is, in theory. In practice, two factors enforce a delay:
Browsers cap the number of concurrent HTTP requests they will allow, both per domain and overall. Chrome, for example, allows six concurrent connections per host over HTTP/1.1, though HTTP/2 multiplexes many requests over a single connection. Based on observing the network panel in Chrome while working on this, the overall ceiling appears to be somewhere around 50-100 in-flight requests.
Web hosts have a limit on how many requests they can handle within a given period, both per IP address and overall. I was frequently getting a "429 Too Many Requests" response from my production environment, so I introduced a delay of 150 milliseconds between requests. They still operate concurrently, it's just that they're forced to wait 150ms per blog. Maybe "stagger" is a better word than "wait" in this context:
Open the code
async function getBlogsDetails(blogs) {
    let promises = [];

    // Iterate and set timeouts to stagger requests by 150ms each
    blogs.forEach((blog, index) => {
        if (typeof blog.url === 'undefined') {
            return;
        }

        let id = blog.id;
        const url = blog.url + '/' + blogDetailsEnpointPath + '?uncache=' + getRandomInt();

        // Create a promise that resolves after a 150ms delay per blog index
        const delayedPromise = new Promise(resolve => {
            setTimeout(async () => {
                try {
                    const blogResult = await fetchBlogDetails(url, id);

                    if (typeof blogResult.urls === 'undefined') {
                        console.error(url, id, blogResult);
                    } else if (!blogResult.urls) {
                        console.error(blogResult);
                    } else if (blogResult.urls.length === 0) {
                        console.error(blogResult);
                    } else {
                        console.log(blogResult);
                    }

                    resolve(blogResult);
                } catch (error) {
                    console.error(`Error fetching details for blog ID ${id}:`, error);
                    resolve(null); // Resolve with null to handle errors gracefully
                }
            }, index * 150); // Offset each request by 150ms
        });

        promises.push(delayedPromise);
    });

    // Wait for all requests to complete
    const blogsResults = await Promise.all(promises);

    // Filter out any null results in case of caught errors
    return blogsResults.filter(result => result !== null);
}
With these limitations factored in, I found that it takes about 170 seconds to scrape all 900 blogs. The 150ms stagger alone accounts for 900 × 150ms, or about 135 seconds; the rest is response time. This is acceptable because I cache the results, meaning the user only has to wait once at the start of each work session.
The result of all this madness, this incredible barrage of Ajax calls, is just plain fun to watch:
PHP and JavaScript: Connecting the dots
I registered my endpoints in PHP and called them in JavaScript. Merging these two worlds is often an annoying and bug-prone part of any project. To make it as easy as possible, I use wp_localize_script():
When called, wp_localize_script() takes my endpoint URLs, bundles them up as JSON, and injects them into the HTML document as a global variable for my JavaScript to read. This leverages the point I noted earlier: because I took care to provide a single function that defines the endpoint URLs, other functions can invoke it without fear of causing side effects.
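To make that concrete, here's roughly the shape of the injected global as the JavaScript sees it. The property names follow the snippets later in this post, but the values here are made up for illustration:

```javascript
// Hypothetical shape of the global injected by wp_localize_script().
// Key names mirror the real plugin; all values are placeholders.
const lexblog_network_analytics = {
  endpoint_urls: {
    insert_blog: 'https://dashboard.example.com/wp-json/example/v1/insert-blog',
  },
  installs: {
    1: { url: 'https://client-one.example.com', user: 'dashboard-bot', pw: 'xxxx xxxx xxxx xxxx' },
  },
};

// The JavaScript side never hard-codes a URL; it just reads the global:
function getInsertBlogUrl() {
  return lexblog_network_analytics.endpoint_urls.insert_blog;
}
```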
Here’s how that ended up looking:
Auth: Fort Knox or Sandbox?
We need to talk about authentication. To what degree do these endpoints need to be protected by server-side logic? Exposing analytics data is not nearly as sensitive as, say, exposing user passwords, but I'd still prefer to keep things reasonably locked up. And since some of these endpoints trigger a lot of database queries and Google Analytics API calls, I'd rather not sit here exposed to anyone who feels like overloading my database or burning through my Google Analytics rate limits.
That's why I registered an application password on each of the 25 client sites. Using an application password in PHP is quite simple: you authenticate the HTTP request just like any other Basic Authentication scheme.
I’m using JavaScript, so I had to localize them first, as described in the previous section. With that in place, I was able to append these credentials when making an Ajax call:
async function fetchBlogsOfInstall(url, id) {
    let install = lexblog_network_analytics.installs[id];
    let pw = install.pw;
    let user = install.user;

    // Create a Basic Auth token
    let token = btoa(`${user}:${pw}`);
    let auth = {
        'Authorization': `Basic ${token}`
    };

    try {
        let data = await $.ajax({
            url: url,
            method: 'GET',
            dataType: 'json',
            headers: auth
        });
        return data;
    } catch (error) {
        console.error('Request failed:', error);
        return [];
    }
}
That file uses a handy function called btoa() to Base64-encode the raw username and password combo into the token format that Basic Authentication expects.
The part where we say, “Oh Right, CORS.”
Whenever I have a project where Ajax calls are flying around all over the place, working reasonably well in my local environment, I always have a brief moment of panic when I try it on a real website, only to get errors like this:
Oh. Right. CORS. Most reasonably secure websites do not allow other websites to make arbitrary Ajax requests. In this project, I absolutely do need the Dashboard Site to make many Ajax calls to the 25 client sites, so I have to tell the client sites to allow CORS:
<?php
// ...

function __construct() {
    add_action( 'rest_api_init', array( $this, 'maybe_add_cors_headers' ), 10 );
}

function maybe_add_cors_headers() {
    // Only allow CORS for the endpoints that pertain to this plugin.
    if ( $this->is_dba() ) {
        add_filter( 'rest_pre_serve_request', array( $this, 'send_cors_headers' ), 10, 2 );
    }
}

function is_dba() {
    $url     = $this->get_current_url();
    $ep_urls = $this->get_endpoint_urls();
    $out     = in_array( $url, $ep_urls );
    return $out;
}

function send_cors_headers( $served, $result ) {
    // Only allow CORS from the dashboard site.
    $dashboard_site_url = $this->get_dashboard_site_url();
    header( "Access-Control-Allow-Origin: $dashboard_site_url" );
    header( 'Access-Control-Allow-Headers: Origin, X-Requested-With, Content-Type, Accept, Authorization' );
    header( 'Access-Control-Allow-Methods: GET, OPTIONS' );
    return $served;
}

[...]
}
You’ll note that I’m following the principle of least privilege by taking steps to only allow CORS where it’s necessary.
Auth, Part 2: I’ve been known to auth myself
I authenticated an Ajax call from the dashboard site to the client sites. I registered some logic on all the client sites to allow the request to pass CORS. But then, back on the dashboard site, I had to get that response from the browser to the server.
The answer, again, was to make an Ajax call to the WordPress REST API endpoint for storing the data. But since this was an actual database write, not merely a read, it was more important than ever to authenticate. I did this by requiring that the current user be logged into WordPress and possess sufficient privileges. But how would the browser know about this?
In PHP, when registering our endpoints, we provide a permissions callback to make sure the current user is an admin:
JavaScript can use this — it’s able to identify the current user — because, once again, that data is localized. The current user is represented by their nonce:
async function insertBlog( data ) {
    let url = lexblog_network_analytics.endpoint_urls.insert_blog;

    try {
        await $.ajax({
            url: url,
            method: 'POST',
            dataType: 'json',
            data: data,
            headers: {
                'X-WP-Nonce': getNonce()
            }
        });
    } catch (error) {
        console.error('Failed to store blogs:', error);
    }
}

function getNonce() {
    if ( typeof wpApiSettings.nonce == 'undefined' ) {
        return false;
    }
    return wpApiSettings.nonce;
}
The wpApiSettings.nonce global variable is automatically present in all WordPress admin screens. I didn’t have to localize that. WordPress core did it for me.
Cache is King
Compressing the Google Analytics data from 900 domains into a three-minute loading .gif is decent, but it would be totally unacceptable to wait that long multiple times per work session. Therefore, I cache the results from all 25 client sites in the database of the dashboard site.
I’ve written before about using the WordPress Transients API for caching data, and I could have used it on this project. However, something about the tremendous volume of data and the complexity implied within the Figma design made me consider a different approach. I like the saying, “The wider the base, the higher the peak,” and it applies here. Given that the user needs to query and sort the data by date, author, and metadata, I think stashing everything into a single database cell — which is what a transient is — would feel a little claustrophobic. Instead, I dialed up E.F. Codd and used a relational database model via custom tables:
It’s been years since I’ve paged through Larry Ullman’s career-defining (as in, my career) books on database design, but I came into this project with a general idea of what a good architecture would look like. As for the specific details — things like column types — I foresaw a lot of Stack Overflow time in my future. Fortunately, LLMs love MySQL and I was able to scaffold out my requirements using DocBlocks and let Sam Altman fill in the blanks:
Open the code
<?php
/**
 * Provides the SQL code for creating the Blogs table. It has columns for:
 * - ID: The ID for the blog. This should just autoincrement and is the primary key.
 * - name: The name of the blog. Required.
 * - slug: A machine-friendly version of the blog name. Required.
 * - url: The url of the blog. Required.
 * - mapped_domain: The vanity domain name of the blog. Optional.
 * - install: The name of the Multisite install where this blog was scraped from. Required.
 * - registered: The date on which this blog began publishing posts. Optional.
 * - firm_id: The ID of the firm that publishes this blog. This will be used as a foreign key to relate to the Firms table. Optional.
 * - practice_area_id: The ID of the practice area this blog covers. This will be used as a foreign key to relate to the PracticeAreas table. Optional.
 * - amlaw: Either a 0 or a 1, to indicate if the blog comes from an AmLaw firm. Required.
 * - subscriber_count: The number of email subscribers for this blog. Optional.
 * - day_view_count: The number of views for this blog today. Optional.
 * - week_view_count: The number of views for this blog this week. Optional.
 * - month_view_count: The number of views for this blog this month. Optional.
 * - year_view_count: The number of views for this blog this year. Optional.
 *
 * @return string The SQL for generating the blogs table.
 */
function get_blogs_table_sql() {
    $slug = 'blogs';
    $out  = "CREATE TABLE {$this->get_prefix()}_$slug (
        id BIGINT NOT NULL AUTO_INCREMENT,
        slug VARCHAR(255) NOT NULL,
        name VARCHAR(255) NOT NULL,
        url VARCHAR(255) NOT NULL UNIQUE, /* adding unique constraint */
        mapped_domain VARCHAR(255) UNIQUE,
        install VARCHAR(255) NOT NULL,
        registered DATE DEFAULT NULL,
        firm_id BIGINT,
        practice_area_id BIGINT,
        amlaw TINYINT NOT NULL,
        subscriber_count BIGINT,
        day_view_count BIGINT,
        week_view_count BIGINT,
        month_view_count BIGINT,
        year_view_count BIGINT,
        PRIMARY KEY (id),
        FOREIGN KEY (firm_id) REFERENCES {$this->get_prefix()}_firms(id),
        FOREIGN KEY (practice_area_id) REFERENCES {$this->get_prefix()}_practice_areas(id)
    ) DEFAULT CHARSET=utf8mb4;";
    return $out;
}
In that file, I quickly wrote a DocBlock for each function, and let the OpenAI playground spit out the SQL. I tested the result and suggested some rigorous type-checking for values that should always be formatted as numbers or dates, but that was the only adjustment I had to make. I think that’s the correct use of AI at this moment: You come in with a strong idea of what the result should be, AI fills in the details, and you debate with it until the details reflect what you mostly already knew.
How it’s going
I’ve implemented most of the user stories now. Certainly enough to release an MVP and begin gathering whatever insights this data might have for us:
One interesting data point thus far: Although all the blogs are on the topic of legal matters (they are lawyer blogs, after all), blogs that cover topics with a more general appeal seem to drive more traffic. Blogs about the law as it pertains to food, cruise ships, germs, and cannabis, for example. Furthermore, the largest law firms on our network don’t seem to have much of a foothold there. Smaller firms are doing a better job of connecting with a wider audience. I’m positive that other insights will emerge as we work more deeply with this.
Regrets? I’ve had a few.
This project probably would have been a nice opportunity to apply a modern JavaScript framework, or just no framework at all. I like React and I can imagine how cool it would be to have this application be driven by the various changes in state rather than… drumroll… a couple thousand lines of jQuery!
I like jQuery’s ajax() method, and I like the jQueryUI autocomplete component. Also, there’s less of a performance concern here than on a public-facing front-end. Since this screen is in the WordPress admin area, I’m not concerned about Google admonishing me for using an extra library. And I’m just faster with jQuery. Use whatever you want.
I also think it would be interesting to put AWS to work here and see what could be done through Lambda functions. Maybe I could get Lambda to make all 25 plus 900 requests concurrently with no worries about browser limitations. Heck, maybe I could get it to cycle through IP addresses and sidestep the 429 rate limit as well.
And what about cron? Cron could do a lot of work for us here. It could compile the data on each of the 25 client sites ahead of time, meaning that the initial three-minute refresh time goes away. Writing an application in cron, initially, I think is fine. Coming back six months later to debug something is another matter. Not my favorite. I might revisit this later on, but for now, the cron-free implementation meets the MVP goal.
I have not provided a line-by-line tutorial here, or even a working repo for you to download, and that level of detail was never my intention. I wanted to share high-level strategy decisions that might be of interest to fellow Multi-Multisite people. Have you faced a similar challenge? I’d love to hear about it in the comments!
The graphic and web design world was once a sanctuary of creative freedom, where designers wielded their tools with boundless possibilities, limited only by imagination.
But now, a dark cloud looms over this vibrant industry: the relentless rise of subscription-based services. What was sold to us as a convenient, cost-effective model is now suffocating designers, stifling innovation, and forcing us into a perpetual cycle of dependence.
The Subscription Trap
In the past, owning design software was simple. You bought a product, installed it, and it was yours—forever. Upgrades were optional and came at your own pace.
Today, companies like Adobe, Figma, and countless others have restructured their models to lock designers into expensive monthly subscriptions. On the surface, it seems practical: always have the latest tools and updates. But this isn’t a fair trade; it’s a hostage situation.
The numbers tell the story. At the time of this writing, Adobe's Creative Cloud subscription starts at $59.99 per month for access to essential apps like Photoshop, Illustrator, and InDesign. Over five years, that adds up to nearly $3,600. For freelancers and small studios, it's a massive financial burden. And if you stop paying? You lose access to everything. All your tools, and practical access to the proprietary files they create—gone.
Creativity on a Clock
The subscription model doesn’t just hurt wallets; it punishes creativity. Deadlines and budgets are already stressful, but the looming threat of losing access to essential tools adds another layer of anxiety. Designers are forced into a “pay-to-play” reality where creativity is a service, not a skill. What happens to innovation when the tools of the trade become gated behind a recurring fee?
Even worse, many subscription services now bundle unrelated features into bloated plans, forcing designers to pay for tools they’ll never use. Want just Photoshop? Too bad. You’ll pay for the entire suite, even if you only need one or two applications. It’s the equivalent of being forced to buy a buffet ticket when all you want is a sandwich.
The New Monopoly on Design
Subscriptions also create a dangerous monopoly on creativity. Companies like Adobe, Figma, and Canva dominate the market, making it nearly impossible for independent or smaller competitors to offer alternatives. As designers, our ability to choose is eroding. The tools we use are dictated by industry standards, which are, in turn, dictated by these subscription giants.
When Figma announced its proposed acquisition by Adobe in 2022 (a deal that was ultimately abandoned in late 2023), the collective gasp from designers worldwide wasn't just about a business deal—it was about the future of affordable, accessible design tools. The writing is on the wall: consolidation and monopolization will leave designers with fewer options and higher costs.
Who Really Benefits?
It’s not the designers. It’s the corporations. Subscription models provide companies with predictable, recurring revenue streams, ensuring their financial security at the expense of their users. They’re no longer incentivized to create groundbreaking new tools; instead, they focus on incremental updates designed to justify the monthly fee. Meanwhile, designers are left paying more for less.
Breaking the Chains
The solution isn’t simple, but it starts with awareness and action. Designers must support alternatives to the subscription model. Open-source software like GIMP, Krita, and Inkscape offers viable, cost-effective options. Companies that still sell perpetual licenses, such as Affinity, deserve our support and advocacy.
Furthermore, we must collectively demand fairer pricing and licensing models. Why can’t companies offer modular subscriptions or rent-to-own options? Designers should be able to pay for the tools they need, not fund a corporation’s endless greed.
Conclusion: A Call to Arms
The graphic and web design community is one of resilience, creativity, and passion. But we cannot afford to let subscription models dictate our futures. It’s time to push back, explore alternatives, and reclaim the tools that allow us to create freely.
Subscriptions aren’t just killing our wallets—they’re killing the very essence of what it means to be a designer. Let’s break the cycle and rediscover the freedom to create.
As always, we’ve aimed for a range of apps, utilities, and services to help make life a little easier for designers, and for developers too. And, of course, what would a November collection be without some Thanksgiving images for our readers in the US? Enjoy!
This web app lets you run some of the most popular AI tasks directly in your browser. There are currently three tools available, with potentially more coming.
Have you ever had a really exasperating client? Or are you sick of hearing the same complaints over and over again – make the logo bigger, I want a $10k site for five bucks, etc.? This will help relieve your feelings. No actual clients are harmed in the process of reducing your irritation.
ErrorPulse aims to simplify front-end error tracking with helpful features and a minimal dashboard. The free plan covering 5k error credits is an ample trial.
QuickPreview lets you live test HTML in the browser, which could be really handy for fast prototyping or quick demos. Currently, any styles or scripts must be inline.
This easy-to-use little timer app sits on your macOS menu bar, and you just pull it down to set it. It automatically matches your system color scheme, and there is a range of alert sounds to choose from.
This set of seasonal images is bright and joyful. Although it is more general autumnal fruit and veg than turkey and pie, there are a couple of festive pilgrim hats.
Flux AI Lab claims that its AI image generation models are superior to Dall-E and Midjourney. Its suite of tools will create realistic, animated, and illustrated styles, and offers consistency across image sets.
Onlook is an open-source visual editor for React apps. It lets you design in your app and instantly writes all changes to code for you. Some technical knowledge is required.
Alt text is one of those things in my muscle memory that pops up anytime I’m working with an image element. The attribute almost writes itself.
<img src="image.jpg" alt="">
Or if you use Emmet, that's autocompleted for you. Don't forget the alt attribute! Include it even when there's no alternative text to provide: an empty string tells screen readers to skip the image on purpose, which is called "nulling" the alternative text, whereas omitting the attribute entirely leaves many screen readers announcing the image file name instead. Just be sure it's truly an empty string, because even a single space gets picked up by some assistive tech and changes how the image is handled:
"Probably" is doing a lot of lifting there because not all images are equal when it comes to content and context. Emma Cionca and Tanner Kohler have a fresh study on those situations where you probably don't need alt. It's a well-written and researched piece, and I'm rounding up some nuggets from it.
What Users Need from Alt Text
It’s the same as what anyone else would need from an image: an easy path to accomplish basic tasks. A product image is a good example of that. Providing a visual smooths the path to purchasing because it’s context about what the item looks like and what to expect when you get it. Not providing an image almost adds friction to the experience if you have to stop and ask customer support basic questions about the size and color of that shirt you want.
So, yes. Describe that image in alt! But maybe “describe” isn’t the best wording because the article moves on to make the next point…
Quit Describing What Images Look Like
The article gets into a common trap that I’m all too guilty of, which is describing an image in a way that I find helpful. Or, as the article says, it’s a lot like I’m telling myself, “I’ll describe it in the alt text so screen-reader users can imagine what they aren’t seeing.”
That’s the wrong way of going about it. Getting back to the example of a product image, the article outlines how a screen reader might approach it:
For example, here’s how a screen-reader user might approach a product page:
Jump between the page headers to get a sense of the page structure.
Explore the details of a specific section with the heading label Product Description.
Encounter an image and wonder “What information that I might have missed elsewhere does this image communicate about the product?”
Interesting! Where I might encounter an image and evaluate it based on the text around it, a screen reader is already questioning what content has been missed around it. This passage is one I need to reflect on (emphasis mine):
Most of the time, screen-reader users don’t wonder what images look like. Instead, they want to know their purpose. (Exceptions to this rule might include websites presenting images, such as artwork, purely for visual enjoyment, or users who could previously see and have lost their sight.)
OK, so how in the heck do we know when an image needs describing? It feels so awkward making what’s ultimately a subjective decision. Even so, the article presents three questions to pose to ourselves to determine the best route.
Is the image repetitive? Is the task-related information in the image also found elsewhere on the page?
Is the image referential? Does the page copy directly reference the image?
Is the image efficient? Could alt text help users more efficiently complete a task?
This is the meat of the article, so I’m gonna break those out.
Is the image repetitive?
Repetitive in the sense that the content around it is already doing a bang-up job painting a picture. If the image is already aptly “described” by content, then perhaps it’s possible to get away with nulling the alt attribute.
This is the figure the article uses to make the point (and, yes, I’m alt-ing it):
The caption for this image describes exactly what the image communicates. Therefore, any alt text for the image will be redundant and a waste of time for screen-reader users. In this case, the actual alt text was the same as the caption. Coming across the same information twice in a row feels even more confusing and unnecessary.
The happy path:
<img src="image.jpg" alt="">
But check out this image about an informal/semi-formal table setting, showing how it is not described by the text around it (and, no, I'm not alt-ing it):
If I was to describe this image, I might get carried away describing the diagram and all the points outlined in the legend. If I can read all of that, then a screen reader should, too, right? Not exactly. I really appreciate the slew of examples provided in the article. A sampling:
Bread plate and butter knife, located in the top left corner.
Dessert fork, placed horizontally at the top center.
Dessert spoon, placed horizontally at the top center, below the dessert fork.
Is the image referential?
The second image I dropped in that last section is a good example of a referential image because I directly referenced it in the content preceding it. I nulled the alt attribute because of that. But what I messed up is not making the image recognizable to screen readers. If the alt attribute is null, then the screen reader skips it. But the screen reader should still know it's there even if it's aptly described.
The happy path:
<img src="image.jpg" alt="A brief description that announces the image">
Remember that a screen reader may announce the image’s file name. So maybe use that as an opportunity to both call out the image and briefly describe it. Again, we want the screen reader to announce the image if we make mention of it in the content around it. Simply skipping it may cause more confusion than clarity.
Is the image efficient?
My mind always goes to performance when I see the word efficient pop up in reference to images. But in this context, the article is asking whether or not the image helps visitors efficiently complete a task.
If the image helps complete a task, say purchasing a product, then yes, the image needs alt text. But if the content surrounding it already does the job then we can leave it null (alt="") or skip it (alt=" ") if there’s no mention of it.
Wrapping up
I put a little demo together with some testing results from a few different screen readers to see how all of that shakes out.