
Archive for the ‘Others’ Category

One of Those “Onboarding” UIs, With Anchor Positioning

December 2nd, 2024

Welcome to “Anchor Positioning 101” where we will be exploring this interesting new CSS feature. Our textbook for this class will be the extensive “Anchor Positioning Guide” that Juan Diego Rodriguez published here on CSS-Tricks.

I’m excited for this one. Some of you may remember when CSS-Tricks released the “Flexbox Layout Guide” or the “Grid Layout Guide” — I certainly do and still have them both bookmarked! I spend a lot of time flipping between tabs to make sure I have the right syntax in my “experimental” CodePens.

I’ve been experimenting with CSS anchor positioning like the “good old days” since Juan published his guide, so I figured it’d be fun to share some of the excitement, learn a bit, experiment, and of course: build stuff!

CSS Anchor Positioning introduction

Anchor positioning lets us attach — or “anchor” — one element to one or more other elements. More than that, it allows us to define how a “target” element (that’s what we call the element we’re attaching to an anchor element) is positioned next to the anchor-positioned element, including fallback positioning in the form of a new @position-try at-rule.

The most hand-wavy way to explain the benefits of anchor positioning is to think of it as a powerful enhancement to position: absolute; as it helps absolutely-positioned elements do what you expect. Don’t worry, we’ll see how this works as we go.

Anchor positioning is currently a W3C draft spec, so you know it’s fresh. It’s marked as “limited availability” in Baseline, which at the time of writing means it is limited to Chromium-based browsers (versions 125+). That said, the considerate folks over at Oddbird have a polyfill available that’ll help out other browsers until they ship support.

This browser support data is from Caniuse, which has more detail. A number indicates that browser supports the feature at that version and up.

Desktop

Chrome: 125, Edge: 125, Firefox: No, IE: No, Safari: No

Mobile / Tablet

Android Chrome: 131, Android Browser: 131, Android Firefox: No, iOS Safari: No

Oddbird contributes polyfills for many new CSS features and you (yes, you!) can support their work on GitHub or Open Collective!

Tab Atkins-Bittner, contributing author to the W3C draft spec on anchor positioning, spoke on the topic at CSS Day 2024. The full conference talk is available on YouTube:

Here at CSS-Tricks, Juan demonstrated how to mix and match anchor positioning with view-driven animations for an awesome floating notes effect:

Front-end friend Kevin Powell recently released a video demonstrating how “CSS Popover + Anchor Positioning is Magical”.

And finally, in the tradition of “making fun games to learn CSS,” Thomas Park released Anchoreum (a “Flexbox Froggy“-type game) to learn about CSS anchor positioning. Highly recommend checking this out to get the hang of the position-area property!

The homework

OK, now that we’re caught up on what CSS anchor positioning is and the excitement surrounding it, let’s talk about what it does. Tethering an element to another element? That has a lot of potential. I can remember quite a few instances where I’ve had to fight with absolute positioning and z-index in order to get something positioned just right.

Let’s take a quick look at the basic syntax. First, we need two elements, an anchor-positioned element and the target element that will be tethered to it.

<!-- Anchor element -->
<div id="anchor">
  Anchor
</div>

<!-- Target element -->
<div id="target">
  Target
</div>

We set an element as an anchor-positioned element by providing it with an anchor-name. This is a unique name of our choosing; however, it needs the double-dash prefix, like CSS custom properties.

#anchor {
  anchor-name: --anchor;
}

As for our target element, we’ll need to set position: absolute; on it as well as tell the element what anchor to tether to. We do that with a new CSS property, position-anchor, using a value that matches the anchor-name of our anchor-positioned element.

#anchor {
  anchor-name: --anchor;
}

#target {
  position: absolute;
  position-anchor: --anchor;
}

It may not look like it yet, but our two elements are now attached. We can set the actual positioning of the target element by providing a position-area value, which overlays an invisible 3×3 grid on the anchor-positioned element. Using positioning keywords, we can designate where the target element appears relative to the anchor-positioned element.

#target {
  position: absolute;
  position-anchor: --anchor;
  position-area: top center;
}

Now we see that our target element is anchored to the top-center of our anchor-positioned element!

CodePen Embed Fallback
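Juan’s guide also covers fallback positioning with the @position-try at-rule mentioned at the top of this post. I’ll mostly leave that to the guide, but here’s a minimal sketch of the idea, building on the same #anchor and #target elements (the --below name is made up for this example, and the syntax assumes Chrome’s implementation at the time of writing, where the list of fallbacks has shipped under both position-try-fallbacks and the older position-try-options):

#target {
  position: absolute;
  position-anchor: --anchor;
  position-area: top center;
  /* If the target would overflow at the top, try the
     named fallback position defined below instead. */
  position-try-fallbacks: --below;
}

@position-try --below {
  position-area: bottom center;
}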

Anchoring pseudo-elements

While playing with anchor positioning, I noticed you can anchor pseudo-elements, just the same as any other element.

#anchor {
  anchor-name: --anchor;

  &::before {
    content: "Target";
    position: absolute;
    position-anchor: --anchor;
    left: anchor(center);
    bottom: anchor(center);
  }
}
CodePen Embed Fallback
a semi-transparent red square labelled "Target" is attached to the upper corner of a blue square labelled "Anchor"

This might be useful for adding design flourishes to elements, or for adding functionality, like some sort of indicator.
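For instance, here’s a rough, hypothetical sketch (the .notify-button class and the badge styling are made up for illustration) of a notification badge hanging off a button’s corner:

.notify-button {
  anchor-name: --notify;

  &::after {
    content: "3";
    position: absolute;
    position-anchor: --notify;
    position-area: top right;
    /* Purely cosmetic badge styling */
    padding: 0.25em 0.5em;
    border-radius: 1em;
    background: crimson;
    color: white;
  }
}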

Moving anchors

Another quick experiment was to see if we can move anchors. And it turns out this is possible!

CodePen Embed Fallback

Notice the use of anchor() functions instead of position-area to position the target element.

#target {
  position: absolute;
  position-anchor: --anchor-one;
  top: anchor(bottom);
  left: anchor(left);
}

CSS anchor functions are an alternate way to position target elements based on the computed values of the anchor-positioned element itself. Here we are setting the target element’s top property value to match the anchor-positioned element’s bottom value. Similarly, we can set the target’s left property value to match the anchor-positioned element’s left value.

Hovering over the container element swaps the position-anchor from --anchor-one to --anchor-two.

.container:hover {
  #target {
    position-anchor: --anchor-two;
  }
}

Because we’re positioning the target with top and left, we can also set a transition on those properties, which makes the target move smoothly between anchors.
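Roughly, the target’s ruleset from the demo picks up one extra line (the timing values here are just placeholders, not the demo’s exact settings):

#target {
  position: absolute;
  position-anchor: --anchor-one;
  top: anchor(bottom);
  left: anchor(left);
  /* Transition the inset properties so the target glides
     when position-anchor swaps to --anchor-two on hover. */
  transition: top 0.3s ease, left 0.3s ease;
}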

Extra experimental

Along with being the first to release CSS anchor positioning, the Chrome dev team recently released new pseudo-selectors related to the <details> and <summary> elements. The ::details-content pseudo-selector allows you to style the “hidden” part of the <details> element.

With this information, I thought: “can I anchor it?” and sure enough, you can!

CodePen Embed Fallback
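The exact code is in the CodePen above, but the general shape, assuming Chrome’s experimental ::details-content support, is something like this rough sketch (the --details anchor name is made up for illustration):

details {
  anchor-name: --details;

  &::details-content {
    /* Tether the revealed content to the <details> element itself,
       the same way we anchored a pseudo-element earlier. */
    position: absolute;
    position-anchor: --details;
    position-area: bottom span-right;
  }
}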

Again, this is definitely not ready for prime-time, but it’s always fun to experiment!

Practical examinations

Let’s take this a bit further and tackle more practical challenges using CSS anchor positioning. Please keep in mind that all these examples are Chrome-only at the time of writing!

Tooltips

One of the most straightforward use cases for CSS anchor positioning is possibly a tooltip. Makes a lot of sense: hover over an icon and a label floats nearby to explain what the icon does. I didn’t quite want to make yet another tutorial on how to make a tooltip, and luckily for me, Zell Liew recently wrote an article on tooltip best practices, so we can focus purely on anchor positioning and refer to Zell’s work for the semantics.

CodePen Embed Fallback

Now, let’s check out one of these tooltips:

<!-- ... -->
<li class="toolbar-item">
  <button type="button" 
    id="inbox-tool" 
    aria-labelledby="inbox-label" 
    class="tool">
    <svg id="inbox-tool-icon">
      <!-- SVG icon code ... -->
    </svg>
  </button>

  <div id="inbox-label" role="tooltip">
    <p>Inbox</p>
  </div>
</li>
<!-- ... -->

The HTML is structured in a way where the tooltip element is a sibling of our anchor-positioned <button> element.
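The anchoring CSS follows the same pattern we’ve seen throughout this post. This is a simplified sketch rather than the exact demo code (the --inbox-tool anchor name is my own, and the hover/focus visibility handling is left to Zell’s article):

#inbox-tool {
  anchor-name: --inbox-tool;
}

#inbox-label {
  position: absolute;
  position-anchor: --inbox-tool;
  position-area: top center;
}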

Categories: Designing, Others

15 Best New Fonts, November 2024

December 2nd, 2024

Welcome to our monthly roundup of the best new fonts we’ve found online in the previous four weeks.

November’s edition is particularly strong for script-inspired sans, display faces with motion and energy, and a charming blackletter font. Enjoy!

Formiga

Formiga is a versatile, humanist display typeface designed for attention-grabbing editorials. It has a clear, confident, and fun style, making it perfect for B2C branding projects. There are seven weights (from light to black) with a ton of OpenType features.

Geoscript

GeoScript is a lovely chunky brush script with three weights: Light, Medium, and Bold. It effortlessly combines handwritten and geometric elements, making it perfect for a corporate logo with a bit more personality.

Anathera

Anathera is a bold, minimalist typeface with a distinctive pixelation of its diagonal strokes. At large sizes the pixelated details feel disruptive, and at small sizes they create a sense of motion in the text. It’s a great choice for any project that needs to stand out.


Morph

Morph is a versatile display type system built from several different stencil and doubleline styles. It’s a clean modern family with enough variety to create visually interesting word shapes. It’s a great choice for logos, editorial, signage, and poster designs.


Brisca Miera

Brisca Miera is a graceful serif, with swashes and alternate characters to add variety and a sense of hand-crafted typography. It’s an excellent option any time you need a font for a sophisticated design.


Dorat

Dorat is an extremely modern blackletter font. The style is vastly under-used — thanks to some unhappy associations — and this font provides the opportunity to lean into a visually strong style while remaining distinctly contemporary.


Mars

Mars is a carefully drawn humanist sans, offering a more interesting corporate face than fonts like Helvetica. It features four different widths: Condensed, Standard, SemiCondensed, and Extended, providing tremendous versatility. It’s a good choice for a company looking for an original voice in its communications.


Milling

Milling was designed after research into the shapes that lend themselves to CNC machining. There are three styles: Simplex, Duplex, and Triplex. It’s a practical font for anyone machining lettering, but it’s also an excellent font for conveying industrial production.


Wild Lines

Wild Lines is a straight, urban graffiti font that captures the raw energy and edginess of street culture. Its clean, linear style adds structure while preserving graffiti’s rebellious feel. Ideal for projects needing an urban, bold aesthetic, it brings a fresh, city-inspired vibe to designs.


LiebeHeide Fineliner

LiebeHeide Fineliner is a friendly script font with clean lines and smooth curves. Featuring plenty of ligatures and alternates, it ensures a natural, organic text flow and unique designs. It adds a personal touch to any project.


Arkbro

Arkbro is a variable typeface inspired by Ellen Arkbro’s track Mountain of Air, reflecting geometric principles and spatial harmony. Featuring extreme weight contrasts, it’s ideal for playful, animated displays. It’s an excellent choice for designs relating to music.


Hentak

Hentak is a script font with a flowing, handwritten style and a vintage feel. Featuring dramatic curves and ornamental details, it combines modern and classic calligraphic qualities.


Formale Grotesque

Formale Grotesque is a sans-serif typeface inspired by a 1930s alphabet board, blending geometric and dynamic forms with low stroke contrast. Its alternates shift from grotesque to humanist styles, adding versatility and depth.


Gonzaga

Gonzaga is a vibrant modern slab serif with contemporary lines and playful rhythms. It has an energy that makes it perfect for logos, posters, and hero text on the web.


VFU

VFU (Visionary Font Ultra) is a humble sans with flared strokes that lean in the direction of serifs. It comes with nine weights with matching italics, and a display version.

Categories: Designing, Others

Blast from the Past: 2Advanced.com

December 1st, 2024

In the early 2000s, the internet was undergoing a massive transformation. Websites were no longer static, text-heavy pages but were becoming immersive digital experiences.

At the forefront of this revolution stood 2Advanced Studios, a name synonymous with cutting-edge web design and innovation. Their website, 2Advanced.com, became an iconic benchmark in the history of the internet, inspiring countless designers and developers.

2Advanced Studios has reimagined its iconic 2001 V3 website for 2024. This modern iteration is built entirely using Rive and React JS, showcasing the studio’s commitment to blending classic aesthetics with contemporary web technologies.

The updated site maintains the futuristic design elements that originally set 2Advanced apart, now enhanced with responsive layouts and interactive animations. This revival not only pays homage to their influential past but also demonstrates their adaptability and continued innovation in the evolving digital landscape.

For a visual journey through their design evolution, you can explore their official YouTube channel, which features content highlighting their creative milestones, or check their X account for the latest news.

The Rise of 2Advanced Studios

Founded in 1999 by Eric Jordan, 2Advanced Studios carved a niche for itself by creating visually stunning and highly interactive websites. At a time when Flash was the dominant technology for creating dynamic online experiences, 2Advanced leveraged its full potential. Their designs were not just websites; they were art.

The company’s flagship website, 2Advanced.com, was a masterpiece of futuristic aesthetics. With themes that drew heavily from cyberpunk and sci-fi inspirations, the site featured sleek metallic textures, neon-lit interfaces, and fluid animations. The navigation was seamless yet experimental, challenging users to explore its digital landscape.

Design That Defined an Era

The site’s most famous iterations—such as “Expedition,” “Prophecy,” and “V5 Ascension”—demonstrated 2Advanced’s mastery of Flash. Each redesign brought a new visual theme that captured the zeitgeist of the web design community. The vibrant colors, cinematic transitions, and immersive soundscapes were unparalleled.

2Advanced wasn’t just about aesthetics. Their work emphasized usability and storytelling, blending form and function to create experiences that were intuitive despite their visual complexity. For many, visiting 2Advanced.com wasn’t just about browsing—it was an event.

Impact on the Web Design Community

2Advanced.com became a beacon of inspiration for web designers and developers worldwide. It showcased what was possible with emerging web technologies and pushed the boundaries of creativity. Forums and design communities buzzed with admiration and analysis of its every iteration. Tutorials and blogs dissected the techniques behind its magic, attempting to emulate its style.

The studio’s influence extended beyond its own website. Clients ranged from Fortune 500 companies to major entertainment brands, all seeking 2Advanced’s ability to merge futuristic visuals with cutting-edge interactivity.

The Downfall of Flash and 2Advanced’s Legacy

As the web evolved, so did its technologies. The rise of mobile devices and the growing importance of responsive design marked the decline of Flash. By the late 2000s, HTML5, CSS3, and JavaScript had become the new standards for creating dynamic web content. These changes rendered Flash-based designs obsolete, and 2Advanced Studios eventually shuttered its operations in the early 2010s.

Yet, the legacy of 2Advanced.com lives on. It remains a cultural touchstone in the history of web design, a reminder of a time when creativity reigned supreme. Its influence can still be seen in modern designs that prioritize user engagement and visual storytelling.

Why It Still Matters

In today’s world of minimalist and utilitarian design, revisiting 2Advanced.com feels like stepping into a time capsule. It reminds us of the boundless enthusiasm and optimism of the early internet era—a time when designers dared to dream big and technology felt like magic.

For those who lived through the golden age of Flash, 2Advanced.com was more than just a website; it was a movement. And for those discovering it for the first time, it stands as a testament to what’s possible when creativity and technology collide.

The spirit of 2Advanced endures, inspiring new generations of designers to push boundaries and create experiences that captivate and inspire.

Categories: Designing, Others

NEAT: Your Go-To Tool for Stunning 3D Gradients

November 30th, 2024

Looking to add an eye-catching touch to your website? NEAT is here to help!

This innovative tool allows you to create beautiful animated 3D gradients that can be used for hero backgrounds or other design elements on your site. It’s incredibly easy to use and gives you endless creative possibilities.

With NEAT, you can input between 1 and 5 colors of your choice. These colors are then processed through a Perlin noise function, which creates smooth transitions and unique patterns. The result? A visually captivating gradient with a natural, flowing feel that draws attention without overwhelming your design.

What’s more, NEAT is all about customization. You can tweak various settings, like noise intensity, blending modes, and color schemes, to craft the perfect gradient that fits your vision. Whether you’re a seasoned web designer or just starting out, this tool offers something for everyone.

So why stick with plain, static backgrounds when you can make your website stand out with dynamic, artistic gradients?

Try NEAT today and bring your design ideas to life in the most colorful way possible!

Check out NEAT now

Categories: Designing, Others

Key Psychological Principles in Web Design: A Deep Dive

November 29th, 2024

Web design is not merely an exercise in aesthetics; it is a blend of art and science where psychology plays a crucial role. Understanding how users think, perceive, and behave online allows designers to create interfaces that feel intuitive, trustworthy, and engaging.

By incorporating psychological principles, web designers can craft experiences that resonate with users, leading to higher engagement, better usability, and stronger brand loyalty. This article explores key psychological principles in web design, illustrated with examples.

The Principle of Visual Hierarchy

Visual hierarchy refers to the arrangement and prioritization of elements on a web page to guide users’ attention. The human brain naturally seeks order, and effective web design leverages this by organizing content in a way that aligns with users’ expectations.

For example, consider the homepage of a news website. Headlines are often displayed in bold, large fonts at the top of the page, while subheadings and body text are smaller and less prominent. This hierarchy directs the user’s focus to the most important content first. The New York Times website exemplifies this principle by combining size, color, and spacing to create a clear flow of information. Large headlines draw attention, images add visual weight, and subtle dividers delineate sections.

Gestalt Principles of Perception

Gestalt psychology explores how people perceive patterns and organize visual information. Several Gestalt principles are particularly relevant to web design, such as proximity, similarity, continuity, and closure.

For instance, the principle of proximity suggests that elements placed close together are perceived as related. Designers use this principle to group navigation links, making it clear that they are part of a cohesive menu. On e-commerce platforms like Amazon, products are grouped into categories with clear visual boundaries, helping users quickly identify relationships between items.

The principle of similarity, on the other hand, dictates that items sharing visual characteristics (e.g., color, shape, or size) are seen as part of the same group. This is why buttons across a website often share the same style—users instinctively recognize them as interactive elements.

Hick’s Law and Decision Making

Hick’s Law states that the time required to make a decision increases with the number of choices available. This principle highlights the importance of simplifying options to avoid overwhelming users.

A prime example of Hick’s Law in action is seen on landing pages for subscription services like Netflix. Instead of presenting a complex array of pricing plans and features upfront, Netflix streamlines the decision-making process by guiding users with a single call-to-action: “Start Your Free Trial.” Additional options and details are revealed incrementally, reducing cognitive load and encouraging users to proceed.

The Role of Color Psychology

Color profoundly influences user behavior and perception, evoking specific emotions and associations. In web design, color can enhance brand identity, guide attention, and affect user decisions.

For example, financial websites often use blue as a primary color due to its association with trust and stability. PayPal employs a blue-dominated color scheme, reinforcing its image as a reliable payment platform. Similarly, e-commerce websites like Amazon use vibrant orange or yellow for “Buy Now” buttons, leveraging these colors’ associations with urgency and positivity.

Fitts's Law and Interactive Design

Fitts's Law states that the time to acquire a target is a function of the distance to and the size of the target. In web design, this principle emphasizes the importance of designing buttons and interactive elements that are easy to locate and click.

Apple’s website demonstrates this principle effectively. Navigation menus are spacious, and clickable elements such as buttons and icons are large enough to ensure usability across devices. This is especially crucial in mobile design, where smaller screens demand thoughtful placement and sizing of touch targets.

Cognitive Load and Simplicity

Cognitive load refers to the amount of mental effort required to process information. Websites with cluttered designs or excessive text can overwhelm users, leading to frustration and abandonment.

Google’s homepage is an archetype of simplicity. With its iconic search bar as the focal point, the page minimizes distractions and allows users to focus entirely on their primary task: searching. By reducing cognitive load, Google ensures that users can interact with the site effortlessly.

The Power of Reciprocity in Persuasion

The principle of reciprocity, rooted in social psychology, suggests that people are inclined to return favors. In web design, this principle can be applied to foster goodwill and encourage user action.

HubSpot, for instance, offers free resources such as e-books, templates, and tools. This creates a sense of reciprocity, making users more likely to engage with their paid services in the future. By offering value upfront, the website establishes trust and fosters a positive relationship with its audience.

Trust and Social Proof

Trust is a cornerstone of user experience, and social proof is a powerful psychological mechanism for building it. Social proof includes user reviews, testimonials, ratings, and case studies, which reassure potential users by demonstrating that others have had positive experiences.

TripAdvisor exemplifies this principle by prominently displaying reviews and ratings for hotels, restaurants, and attractions. The volume and variety of reviews give users confidence in making informed decisions, reducing uncertainty.

Scarcity and Urgency in E-Commerce

Scarcity creates a sense of urgency, motivating users to act quickly. This psychological principle is frequently used in e-commerce to drive conversions.

Booking.com leverages scarcity by showing messages like “Only 2 rooms left!” or “10 people are looking at this hotel.” These prompts tap into the fear of missing out (FOMO), nudging users to complete their bookings promptly.

Anchoring Effect in Pricing Strategies

The anchoring effect occurs when people rely heavily on the first piece of information they encounter. In web design, this principle is often used in pricing tables.

For instance, SaaS companies like Adobe Creative Cloud display their premium plans alongside lower-cost alternatives. By showcasing the higher-priced option first, they lead users to perceive the subsequent, lower-priced plans as more affordable, even if they are still relatively expensive.

Conclusion

Incorporating psychological principles into web design is not just about making a website visually appealing; it’s about creating an intuitive and engaging experience that aligns with human behavior.

By understanding concepts like visual hierarchy, Gestalt principles, Hick’s Law, and cognitive load, designers can craft interfaces that are not only functional but also emotionally resonant.

The interplay between psychology and design ensures that websites are more than digital storefronts—they become meaningful spaces that cater to users’ needs and desires.

Categories: Designing, Others

Artificial Intelligence in Fraud Prevention: What Online Businesses Need to Know 

November 29th, 2024

Online businesses must deal with fraud at one point or another; it is an unfortunate but inevitable part of operating online. Artificial Intelligence (AI) has become a helpful additional tool for detecting, preventing, and responding to fraudulent activity before it becomes a notable issue, which is a relief in the face of the overwhelming task of fraud prevention. By using AI capabilities, companies gain a considerable advantage against fraudsters and can operate more safely.

How AI Changes the Game in Fraud Prevention

Let’s be truthful here; we all know that AI isn’t perfect, but it has already been a real game-changer in the fight against crime, fraud included. It can analyze large amounts of data in real time, highlight patterns of possible fraud in seconds, and continuously improve via machine learning, with our help of course.

More importantly, AI fraud detection tools develop an understanding of the common marks of fraudulent activity, unlike traditional systems, which can take ten times longer to work out what is going on.

Pattern Recognition

AI does very well in environments where pattern recognition is needed the most. Fraudulent behavior often follows a pattern, such as a strange location for an account login when using a VPN (Virtual Private Network) or several quick purchases over a short period. AI algorithms can detect these anomalies and flag them for review.

Advanced machine learning models are far superior in fraud versus non-fraud transaction classification. For example, an AI system can analyze historical customer behavior to uncover outliers that may remain unseen by a human analyst or other traditional systems. The more the models learn about legitimate behavior, the more accurately they pinpoint unusual activities.

Speed and Efficiency

AI-powered fraud prevention tools work in real time, meaning they can analyze transactions as they occur. This is essential for mitigating risk, since fraudsters often act quickly to move any funds they gain access to before they are detected.

AI also drastically reduces the load on human fraud analysts (yes, AI can’t be left alone, not yet) by filtering out legitimate transactions and highlighting only those that need further investigation. In this way, it keeps online businesses running smoothly while also improving their security.

Adaptive Learning

The ability to adapt over time makes AI and machine learning different. While fraudsters continue evolving tactics to bypass existing security, AI tools are not stationary – they learn and improve as they encounter new data. Such adaptability puts AI-powered systems in an excellent position to match and often outpace evolving fraud techniques, providing a sense of security in the face of changing threats.

Best Practices for AI Implementation in Fraud Prevention

With an investment in high-quality data, an online business is on the right path to capitalize on AI to prevent fraud. Since AI systems learn from training data (which makes users the teachers), they need appropriate inputs to produce accurate outputs. Precise and comprehensive data collection will therefore be of prime significance for a business.

The Know Your Customer (KYC) process is essential for financial institutions, and it is equally important that online businesses have a Know Your Business (KYB) policy as well. That means deep knowledge of the unique risk factors relating to your company, and businesses can develop such insights with the help of AI. Making the most of AI-based solutions pays off in the long term, so investing in comprehensive data collection and regular audits is always recommended.

AI-Powered Fraud Prevention: The Advantages for Online Businesses

AI has several unique advantages over traditional fraud prevention methods. Let’s look at the significant benefits that AI can provide to online businesses.

Increased Accuracy

The ability of AI to detect fraud far outstrips traditional rule-based systems. Machine learning models process large volumes of data, learning to flag even the slightest hints of fraud that human analysts could overlook, because all humans can make mistakes. By reducing false positives (instances where legitimate transactions are flagged as fraudulent), AI helps ensure that legitimate customers are not disturbed.

Reduced Operational Costs

Fraud prevention can be quite resource-intensive, so AI helps by reducing operating costs, automating processes, and reducing the number of fraud prevention team members an organization needs. This cost efficiency can be a relief for many businesses. Human intervention will likely always be required, though; as we said before, AI can’t be left entirely alone.

Scalability

Rapid growth also makes scaling fraud prevention systems quite challenging for businesses. AI-based fraud detection systems are fully scalable; as transaction volume increases, their efficiency remains uncompromised, which makes them a natural fit for e-commerce platforms with seasonal sales spikes.

Steps for online businesses integrating AI against fraud

If you are an e-commerce business looking to implement AI in fraud prevention, you should take several steps, which are:

Assess Your Current Fraud Prevention Strategy

First, start by identifying the weaknesses in your current fraud prevention processes: Are you experiencing false positives? Is your team drowning in manual reviews? Such questions will give you an idea of which AI tools will most benefit your business.

Partner with a reliable AI vendor 

The key is choosing the right vendor: look for an AI solution provider with extensive experience in fraud prevention within your industry and a track record of success. Also, ensure the vendor provides flexible APIs that allow different software systems and programs to communicate, for easy integration with your current systems.

Train and Implement

Most AI-based solutions for fraud prevention need some upfront training in the form of tuning their algorithms. Work closely with your provider to feed them the historical data the AI can learn from. Test the system in a controlled environment before fully deploying it.

Monitor and Optimise

You must constantly monitor your AI system’s performance after it has been fully deployed. Fraud detection is not a set-and-forget job; periodically check and adjust your system’s accuracy as necessary, as fraud patterns change and business needs evolve.

Challenges to Note in Using AI for Fraud Prevention

It is important to note that AI has enormous advantages, but it is not all smooth sailing. It could be hard for smaller businesses with limited resources to implement AI, but if you want to, it is very achievable. 

Businesses should consider GDPR laws while deploying AI systems because compliance is a huge factor, especially concerning sensitive customer data. It should be noted that businesses must partner with AI providers that understand how data is being used and protected. 

AI in Future Fraud Prevention 

Improvements in AI and machine learning will dominate the future of fraud prevention; as various technologies improve, fraud detection tools will be more accurate, responsive, and user-friendly. We will see more businesses using AI for proactive threat assessment, such as fraud prediction, even before it happens.

AI will also be of prime importance in more extensive verification processes such as KYB and KYC, helping organizations do much more than fraud detection by building trust with customers and partners. AI is not a short-term trend in fraud prevention but a long-term solution that will continue to evolve as fraud itself does.

Conclusion

Fraud is a constant plague in the modern world, but AI gives online businesses new and powerful tools to fight it. Companies can minimize their risks by using AI for anomaly detection, real-time analysis, and KYB verification, while improving accuracy, efficiency, and the customer experience.

If your online business has not yet moved to improve its fraud prevention, now is the time to add AI to your strategy. It is an investment worth making: peace of mind, plus an edge over competitors in this rapidly evolving digital world.

Featured image by Shuto Araki on Unsplash

The post Artificial Intelligence in Fraud Prevention: What Online Businesses Need to Know  appeared first on noupe.

Categories: Others

How To Use Dynamic Content For High-performing Holiday Email Marketing

November 28th, 2024

Around 75% of consumers lean into the email channel for both promotional and transactional updates from brands.

But it is even more preferred when the holiday season hits.

94.4% of consumers say they need transactional messages during this busy time, and 79.8% are willing to receive personalized promotions. 

The point is: email is the channel consumers rely on to find the best Black Friday deals and holiday offers. Great news for brands and marketers who love the low cost of sending emails.

But it also means that they are under ever-greater pressure to ensure that the right messages reach the inboxes of holiday shoppers, who are busier (and choosier) than ever, at the right time.

With promotions and seasonal deals piling up in email inboxes faster than snow on a December morning, your emails will have to connect with your audience on a 1:1 level. 

Email personalization using dynamic content is your chance to nail that. 

Dynamic email content offers countless opportunities to reach out to your subscribers with heartfelt, personalized messaging that spreads cheer this holiday season. It can also help you nurture more leads by being at the forefront of their minds when they think about their holiday shopping needs. 

So, if you are looking for clever ways to use dynamic email content for holiday email marketing, you’ve come to the right place. 

But before we peel back the layers of the different ways you can use dynamic content in holiday emails, let’s go over what dynamic content is.

What Is Dynamic Email Content?

Say you have a prospect named Ken. Ken hasn’t bought anything from you yet, but thanks to a dashboard full of data, you know that Ken has been interested in product X.

Now, here’s my question—

Would you send Ken a yawn-inducing welcome email full of generic deals on unrelated items? Or would you personalize the welcome email with some irresistible deals on the very product he’s interested in?

If you are serious about delivering personalized (read: relevant) shopping experiences that holiday shoppers want, you will find ways to fine-tune your holiday email campaigns based on who opens them, right?

Well, my friend, that’s done by dynamic email content. And it’s one of the expert-recommended email marketing strategies for this holiday season.

Dynamic email content is a personalized email element that changes based on subscriber behavior, interests, or order history. 

This could mean highlighted products in an upcoming sale that change based on the recipient’s interests, visual content that adapts images to the customer’s gender preferences, or an email design that adjusts to local weather, customs, or cultural context.

While basic email personalization, like including the recipient’s name in the subject line, plays a part in holiday email personalization, dynamic email content takes things further. It allows marketers to draw on subscribers’ data and behavior and send one campaign, one time, that is optimized and targeted to every individual. 

This means that the entire email doesn’t have to be unique for every single customer. Only some aspects need to be unique, tailored to the individual subscriber’s preferences.

Top 6 Ways to Energize Your Holiday Email Campaigns With Dynamic Content

Some ways to use dynamic content and make your holiday email flows more effective are:

  1. Countdown Timers

Countdown timers in holiday emails tell your subscribers that the holiday deal is about to disappear or that prices are going to soar after the timer stops. 

Each time a customer opens the email, they get a real-time reminder of the remaining days, hours, minutes, and seconds. Not to mention the adrenaline rush and the FOMO of losing out on your brand’s special offer. It pushes them to grab it before it’s too late.

And it’s not just urgency or scarcity, either. Including countdown timers in your holiday email campaigns helps customers plan better during the busy holiday season. 

However, they are also a matter of trust. So, if the email says the offer ends today, it should end today. 

  2. Recommended or Popular Gift Ideas

The more personalized your gift recommendations, the more your customers will spend this holiday season. 

One of the simplest and most effective ways to do this is to send dynamic emails that suggest tailored or popular gifts. 

They are game-changers, especially during the holiday season. Your subscribers are looking for gift ideas. Show them personalized gift ideas based on their past purchases and browsing behavior. Better yet, curate a gift guide with popular items, best sellers, or bundle products frequently bought together. 

By using product feeds and audience segmentation, you can help shoppers find the perfect gift for loved ones. 

  3. Loyalty Programs Update

Loyal customers are worth celebrating. We all count on them to keep sales rolling in, don’t we? 

But to keep them choosing your brand, especially during the holiday season, you must keep them delighted with rewards. Otherwise, they might just wander off to competitors offering attractive holiday deals. 

And if dynamic loyalty programs don’t do that for you, what will?

Holiday emails with dynamic loyalty points show subscribers how many points they’ve earned after their past purchases. They remind them of the rewards they can unlock using these points—exclusive discounts, festive offers, free gifts, or limited-time perks. 

Each point reflects their engagement with your brand, like purchases, referrals, or other meaningful interactions. And wouldn’t you be thrilled at the prospect of redeeming points right when you’re ready to shop? Likewise, for your subscribers. 

The excitement of redeeming points nudges them closer to completing their shopping carts and makes them feel valued. They, after all, have stuck with you through thick and thin.

Just remember, this holiday email marketing strategy only works if your campaigns have been consistently rewarding loyal customers all year round.

  4. Shipping and Order Tracking

You have worked so hard to create and market an amazing product. I am sure you don’t want a poor post-purchase experience to be the only thing your customers remember about your brand. 

So, remember this–

Holiday shoppers shouldn’t have to refresh the tracking page repeatedly and wonder if their holiday gift will arrive on time. 

With dynamic shipment and order tracking emails, real-time tracking information is embedded in the email itself, so customers can conveniently check their order status without clicking through to a separate page. Instead of the static delivery estimate we normally see in standard emails, these emails feature a live tracking graphic that keeps updating in real time.

It’s a straightforward yet effective way to add a touch of reassurance to your holiday emails: customers know their holiday gifts are en route, so they have one less thing to worry about.

  5. Product Stock Updates

Nothing’s worse for holiday shoppers (or your brand) than a customer who has their heart set on a gift only to find it sold out. 

Try dynamic emails with real-time stock updates to save yourself from embarrassment. These dynamic elements are a must for your holiday email workflows as they tell customers when to rush to place an order. The undertones of urgency drive purchases before the must-haves disappear. 

Take it a notch further by automatically removing low-stock items from emails. This spares shoppers the disappointment of seeing unavailable products, keeping their holiday spirits intact.

  6. Geolocation

The logic here is sound: know your subscribers’ geographic locations and send them targeted, location-based content that resonates with their specific region. 

That’s much more meaningful than saying, “Get cozy with our hot cocoa gift set!” to someone planning a beach barbecue.

By asking for a subscriber’s zip code when they sign up, you have the chance to deliver location-based offers, time-zone-specific sends, and even maps to the nearest store.

Another holiday email marketing strategy that makes sense for global brands is tailoring email visuals and messaging using geolocation.

For instance, traditional winter themes could be used for subscribers in the Northern Hemisphere and sunny, beachy themes for those in the Southern Hemisphere.

Such emails make for an engaging and relevant holiday shopping experience because they tailor content and design to match the local climate, making it uniquely suited to individual subscribers. 

Wrapping Up 

It should be pretty clear by now that dynamic email elements are a truly unique feature that you can use in your holiday email marketing campaigns. They make your email design stand out from your competition and trigger a feeling of urgency in your subscribers.

Sure, crafting dynamic email campaigns for the holiday season takes time, creativity, and planning. But it is worth every bit of effort. 

Just be sure to thoroughly test your emails to catch any potential rendering issues and ensure they reach your audience’s inbox without a hitch.

Featured Image by Annie Spratt on Unsplash

The post How To Use Dynamic Content For High-performing Holiday Email Marketing appeared first on noupe.

Categories: Others

20 Best New Websites, November 2024

November 28th, 2024

Welcome to November’s new sites collection. 

Minimalism continues to be a dominant trend among well designed websites, but it is clear that minimal does not mean visually dull. Minimalist design can incorporate color, animation, and even decorative fonts, as long as restraint is exercised.

On the other hand, a strong site architecture with a clear and robust structure can convey a sense of simplicity, even if the visual design is more elaborate. When content is organized, users will feel more comfortable navigating the site. Enjoy!

Muskegon Art Museum

Muskegon Art Museum’s site uses crisp black and white along with clean typography to create a bold, modern look that lets the images stand out.

I See My Light Shining

Vibrant, characterful illustrations help bring to life this collection of oral testimonies from over 200 elders, including activists and community builders, who witnessed and helped shape change in American society.

Hugmun

This portfolio site for Hugmun creative studio makes clever use of a central slideshow to create a structure that can present plenty of content for an individual project while keeping others within easy reach.


Emergence Magazine

Emergence Magazine is a magazine and creative studio that explores the connections between ecology, culture, and spirituality through storytelling and art across various mediums. Interviews and essays sit alongside films and immersive web experiences on a calm, unobtrusive backdrop.


RSPCA Animal Futures

This interactive experience from the RSPCA (Royal Society for the Prevention of Cruelty to Animals) explores the impact that technology, climate change, political decisions, and even our dietary choices will have on the future. The illustration style is friendly without being too cutesy, and the gamified format allows information to be presented in digestible chunks.


Duten

The minimalist design of Duten’s website reflects the minimalist style of its product range. Considered animation effects add a layer of sophistication.


Lifeworld

Lifeworld is an artwork by Olafur Eliasson for WeTransfer as guest curator of its artist platform. The use of black and white and the irregular grid layout creates drama and an interesting rhythm.


Gelato La Boca

This site for Gelato La Boca is bright with a fun, almost comic-book feel. The color scheme is actually quite minimal, but because of how the colors are used, it seems like more.


Commissioner of Design

The anarchic style of this portfolio website shows off the designer’s creativity, and it is backed up by well-structured case studies.


Oakame

This is an appealingly minimalist site. Several design elements, such as the product details and customization boxes, and the display type, reflect the style of the products sold.


Skillbard

Skillbard has recently rebranded, and this website is part of that new brand identity. It has a sense of playfulness about it, with wiggly and animated type and a color scheme that changes randomly.


Hurry Up, We’re Dreaming

HUWD is a new platform for challenging how technology is developed and deployed with the aim of adopting a more thoughtful approach. The logotype has a deliberate liquidness, and the occasional color gradients give an ethereal feel.


The Farm Society

The combination of lively illustration and close-up produce photography adds personality and vibrancy to this otherwise minimal site.


Harry’s Inc.

The soft color scheme here exudes warmth in this brand group site. This approach avoids a corporate feel and presents a much more personable identity.


Insight

The clever landing page concept of a contact sheet with magnifier piques the user’s interest before leading to a well-organized, easy to navigate agency portfolio.


Finch Hatton MTB

The custom type here conveys movement and dynamism. The color scheme is earthy, with bright accents that add vibrancy.


Smart Playrooms

The color scheme here is muted, allowing the photographs to provide the bulk of the visual interest. The overall feel is clean while also welcoming.


Otherlife

This portfolio site for creative agency Otherlife focuses almost entirely on case studies. These are well presented with plenty of images and concise supporting text. The agency’s own branding is minimal and avoids intruding.


Docky

Docky is an Airbnb-style platform connecting boat owners with berths. This supporting website splits into two to cover each side of the rental separately. Animation and simple illustration add depth.


ClearSpace x Omega

Watchmaker Omega is promoting its support of the ClearSpace project to remove manmade debris from space. Animation and illustration combine to create an impactful and informative experience.

Categories: Designing, Others

WordPress Multi-Multisite: A Case Study

November 27th, 2024

The mission: Provide a dashboard within the WordPress admin area for browsing Google Analytics data for all your blogs.

The catch? You’ve got about 900 live blogs, spread across about 25 WordPress multisite instances. Some instances have just one blog, others have as many as 250. In other words, what you need is to compress a data set that normally takes a very long time to compile into a single user-friendly screen.

The implementation details are entirely up to you, but the final result should look like this Figma comp:

Design courtesy of the incomparable Brian Biddle.

I want to walk you through my approach and some of the interesting challenges I faced coming up with it, as well as the occasional nitty-gritty detail in between. I’ll cover topics like the WordPress REST API, choosing between a JavaScript or PHP approach, rate/time limits in production web environments, security, custom database design — and even a touch of AI. But first, a little orientation.

Let’s define some terms

We’re about to cover a lot of ground, so it’s worth spending a couple of moments reviewing some key terms we’ll be using throughout this post.

What is WordPress multisite?

WordPress Multisite is a feature of WordPress core — no plugins required — whereby you can run multiple blogs (or websites, or stores, or what have you) from a single WordPress installation. All the blogs share the same WordPress core files, wp-content folder, and MySQL database. However, each blog gets its own folder within wp-content/uploads for its uploaded media, and its own set of database tables for its posts, categories, options, etc. Users can be members of some or all blogs within the multisite installation.

What is WordPress multi-multisite?

It’s just a nickname for managing multiple instances of WordPress multisite. It can get messy to have different customers share one multisite instance, so I prefer to break it up so that each customer has their own multisite, but they can have many blogs within their multisite.

So that’s different from a “Network of Networks”?

It’s apparently possible to run multiple instances of WordPress multisite against the same WordPress core installation. I’ve never looked into this, but I recall hearing about it over the years. I’ve heard the term “Network of Networks” and I like it, but that is not the scenario I’m covering in this article.

Why do you keep saying “blogs”? Do people still blog?

You betcha! And people read them, too. You’re reading one right now. Hence, the need for a robust analytics solution. But this article could just as easily be about any sort of WordPress site. I happen to be dealing with blogs, and the word “blog” is a concise way to express “a subsite within a WordPress multisite instance”.

One more thing: In this article, I’ll use the term dashboard site to refer to the site from which I observe the compiled analytics data. I’ll use the term client sites to refer to the 25 multisites I pull data from.

My implementation

My strategy was to write one WordPress plugin that is installed on all 25 client sites, as well as on the dashboard site. The plugin serves two purposes:

  • Expose data at API endpoints of the client sites
  • From the dashboard site, scrape the data from the client sites, cache it in the database, and display it in a dashboard.

The WordPress REST API is the Backbone

The WordPress REST API is my favorite part of WordPress. Out of the box, WordPress exposes default WordPress stuff like posts, authors, comments, media files, etc., via the WordPress REST API. You can see an example of this by navigating to /wp-json from any WordPress site, including CSS-Tricks. Here’s the REST API root for the WordPress Developer Resources site:

The root URL for the WordPress REST API exposes structured JSON data, such as this example from the WordPress Developer Resources website.

What’s so great about this? WordPress ships with everything developers need to extend the WordPress REST API and publish custom endpoints. Exposing data via an API endpoint is a fantastic way to share it with other websites that need to consume it, and that’s exactly what I did:


<?php

[...]

function register(WP_REST_Server $server) {
  $endpoints = $this->get();

  foreach ($endpoints as $endpoint_slug => $endpoint) {
    register_rest_route(
      $endpoint['namespace'],
      $endpoint['route'],
      $endpoint['args']
    );
  }
}

function get() {

  $version = 'v1';

  return array(
      
    'empty_db' => array(
      'namespace' => 'LXB_DBA/' . $version,
      'route'     => '/empty_db',
      'args'      => array(
        'methods' => array( 'DELETE' ),
        'callback' => array($this, 'empty_db_cb'),
        'permission_callback' => array( $this, 'is_admin' ),
      ),
    ),

    'get_blogs' => array(
      'namespace' => 'LXB_DBA/' . $version,
      'route'     => '/get_blogs',
      'args'      => array(
        'methods' => array('GET', 'OPTIONS'),
        'callback' => array($this, 'get_blogs_cb'),
        'permission_callback' => array($this, 'is_dba'),
      ),
    ),

    'insert_blogs' => array(
      'namespace' => 'LXB_DBA/' . $version,
      'route'     => '/insert_blogs',
      'args'      => array(
        'methods' => array( 'POST' ),
        'callback' => array($this, 'insert_blogs_cb'),
        'permission_callback' => array( $this, 'is_admin' ),
      ),
    ),

    'get_blogs_from_db' => array(
      'namespace' => 'LXB_DBA/' . $version,
      'route'     => '/get_blogs_from_db',
      'args'      => array(
        'methods' => array( 'GET' ),
        'callback' => array($this, 'get_blogs_from_db_cb'),
        'permission_callback' => array($this, 'is_admin'),
      ),
    ),  

    'get_blog_details' => array(
      'namespace' => 'LXB_DBA/' . $version,
      'route'     => '/get_blog_details',
      'args'      => array(
        'methods' => array( 'GET' ),
        'callback' => array($this, 'get_blog_details_cb'),
        'permission_callback' => array($this, 'is_dba'),
      ),
    ),   

    'update_blogs' => array(
      'namespace' => 'LXB_DBA/' . $version,
      'route'     => '/update_blogs',
      'args'      => array(
        'methods' => array( 'PATCH' ),
        'callback' => array($this, 'update_blogs_cb'),
        'permission_callback' => array($this, 'is_admin'),
      ),
    ),     

  );
}

We don’t need to get into every endpoint’s details, but I want to highlight one thing. First, I provided a function that returns all my endpoints in an array. Next, I wrote a function to loop through the array and register each array member as a WordPress REST API endpoint. Rather than doing both steps in one function, this decoupling allows me to easily retrieve the array of endpoints in other parts of my plugin to do other interesting things with them, such as exposing them to JavaScript. More on that shortly.

Once registered, the custom API endpoints are observable in an ordinary web browser like in the example above, or via purpose-built tools for API work, such as Postman:

JSON output.

PHP vs. JavaScript

I tend to prefer writing applications in PHP whenever possible, as opposed to JavaScript, and executing logic on the server, as nature intended, rather than in the browser. So, what would that look like on this project?

  • On the dashboard site, upon some event, such as the user clicking a “refresh data” button or perhaps a cron job, the server would make an HTTP request to each of the 25 multisite installs.
  • Each multisite install would query all of its blogs and consolidate its analytics data into one response per multisite.

Unfortunately, this strategy falls apart for a couple of reasons:

  • PHP operates synchronously, meaning you wait for one line of code to execute before moving to the next. This means that we’d be waiting for all 25 multisites to respond in series. That’s sub-optimal.
  • My production environment has a max execution limit of 60 seconds, and some of my multisites contain hundreds of blogs. Querying their analytics data takes a second or two per blog.
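To make the first point concrete, the synchronous version would have looked something like this (hypothetical code I never shipped; the get_installs() helper is made up for the sketch):

<?php

// Hypothetical: the synchronous approach. Each wp_remote_get() blocks until
// that install responds, so the 25 installs answer one after another.
$all_data = array();

foreach ( $this->get_installs() as $install_url ) {
  // This call alone can take a second or more per install...
  $response = wp_remote_get( $install_url . '/wp-json/LXB_DBA/v1/get_blogs' );

  // ...and the 60-second execution limit keeps ticking the whole time.
  $all_data[] = json_decode( wp_remote_retrieve_body( $response ), true );
}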

Damn. I had no choice but to swallow hard and commit to writing the application logic in JavaScript. Not my favorite, but an eerily elegant solution for this case:

  • Due to the asynchronous nature of JavaScript, it pings all 25 Multisites at once.
  • The endpoint on each Multisite returns a list of all the blogs on that Multisite.
  • The JavaScript compiles that list of blogs and (sort of) pings all 900 at once.
  • All 900 blogs take about one-to-two seconds to respond concurrently.

Holy cow, it just went from this:

( 1 second per Multisite * 25 installs ) + ( 1 second per blog * 900 blogs ) = roughly 925 seconds to scrape all the data.

To this:

1 second for all the Multisites at once + 1 second for all 900 blogs at once = roughly 2 seconds to scrape all the data.

That is, in theory. In practice, two factors enforce a delay:

  1. Browsers cap how many concurrent HTTP requests they will make, both per domain and overall. Exact documentation is hard to pin down (HTTP/1.1 connections are typically limited to around six per host), but based on watching the network panel in Chrome while working on this, the practical ceiling looked like roughly 50-100 requests in flight.
  2. Web hosts limit how many requests they will handle within a given period, both per IP address and overall. I was frequently getting a 429 "Too Many Requests" response from my production environment, so I introduced a delay of 150 milliseconds between requests. They still fire concurrently; each one is just staggered by 150ms per blog. Maybe "stagger" is a better word than "wait" in this context:
async function getBlogsDetails(blogs) {
  let promises = [];

  // Iterate and set timeouts to stagger requests by 150ms each
  blogs.forEach((blog, index) => {
    if (typeof blog.url === 'undefined') {
      return;
    }

    let id = blog.id;
    const url = blog.url + '/' + blogDetailsEnpointPath + '?uncache=' + getRandomInt();

    // Create a promise that resolves after a 150ms delay per blog index
    const delayedPromise = new Promise(resolve => {
      setTimeout(async () => {
        try {
          const blogResult = await fetchBlogDetails(url, id);

          if (typeof blogResult.urls === 'undefined') {
            console.error(url, id, blogResult);
          } else if (!blogResult.urls) {
            console.error(blogResult);
          } else if (blogResult.urls.length === 0) {
            console.error(blogResult);
          } else {
            console.log(blogResult);
          }

          resolve(blogResult);
        } catch (error) {
          console.error(`Error fetching details for blog ID ${id}:`, error);
          resolve(null); // Resolve with null to handle errors gracefully
        }
      }, index * 150); // Offset each request by 150ms
    });

    promises.push(delayedPromise);
  });

  // Wait for all requests to complete
  const blogsResults = await Promise.all(promises);

  // Filter out any null results in case of caught errors
  return blogsResults.filter(result => result !== null);
}

With these limitations factored in, I found that it takes about 170 seconds to scrape all 900 blogs. This is acceptable because I cache the results, meaning the user only has to wait once at the start of each work session.

The result of all this madness, this incredible barrage of Ajax calls, is just plain fun to watch.

PHP and JavaScript: Connecting the dots

I registered my endpoints in PHP and called them in JavaScript. Merging these two worlds is often an annoying and bug-prone part of any project. To make it as easy as possible, I use wp_localize_script():

<?php

[...]

class Enqueue {

  function __construct() {
    add_action( 'admin_enqueue_scripts', array( $this, 'lexblog_network_analytics_script' ), 10 );
    add_action( 'admin_enqueue_scripts', array( $this, 'lexblog_network_analytics_localize' ), 11 );
  }

  function lexblog_network_analytics_script() {
    wp_register_script( 'lexblog_network_analytics_script', LXB_DBA_URL . '/js/lexblog_network_analytics.js', array( 'jquery', 'jquery-ui-autocomplete' ), false, false );
  }

  function lexblog_network_analytics_localize() {
    $a = new LexblogNetworkAnalytics;
    $data = $a -> get_localization_data();
    $slug = $a -> get_slug();

    wp_localize_script( 'lexblog_network_analytics_script', $slug, $data );

  }

  // etc.              
}

In that script, I’m telling WordPress two things:

  1. Load my JavaScript file.
  2. When you do, take my endpoint URLs, bundle them up as JSON, and inject them into the HTML document as a global variable for my JavaScript to read. This leverages the earlier point about decoupling: because the endpoint definitions live in one convenient function, other parts of the plugin can read them without fear of side effects.

Here’s how that ended up looking:

The JSON and its associated JavaScript file, where I pass information from PHP to JavaScript using wp_localize_script().
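Boiled down to the keys that the JavaScript actually reads later, the localized data is shaped roughly like this (placeholder values, simplified from the real method):

<?php

// Simplified: the shape of the localized data. The slug becomes the name of
// the global JavaScript variable (lexblog_network_analytics).
function get_localization_data() {
  return array(
    'endpoint_urls' => $this->get_endpoint_urls(),
    'installs'      => array(
      // One entry per client install: its URL plus application-password creds.
      // (These values are placeholders.)
      1 => array(
        'url'  => 'https://some-install.example.com',
        'user' => 'dashboard-bot',
        'pw'   => 'xxxx xxxx xxxx xxxx',
      ),
      // ...
    ),
  );
}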

Auth: Fort Knox or Sandbox?

We need to talk about authentication. To what degree do these endpoints need to be protected by server-side logic? Although exposing analytics data is not nearly as sensitive as, say, user passwords, I’d prefer to keep things reasonably locked up. Also, since some of these endpoints perform a lot of database queries and Google Analytics API calls, it’d be weird to sit here and be vulnerable to weirdos who might want to overload my database or burn through my Google Analytics rate limits.

That’s why I registered an application password on each of the 25 client sites. Using an application password in PHP is quite simple: you authenticate the HTTP request with a standard Basic Authentication header, just like any other basic auth scheme.
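If the request were coming from PHP, it would look roughly like this (placeholder URL and credentials):

<?php

// Sketch: a server-side request authenticated with an application password.
$response = wp_remote_get(
  'https://some-install.example.com/wp-json/LXB_DBA/v1/get_blogs',
  array(
    'headers' => array(
      'Authorization' => 'Basic ' . base64_encode( 'dashboard-bot:xxxx xxxx xxxx xxxx' ),
    ),
  )
);

$blogs = json_decode( wp_remote_retrieve_body( $response ), true );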

I’m using JavaScript, so I had to localize them first, as described in the previous section. With that in place, I was able to append these credentials when making an Ajax call:

async function fetchBlogsOfInstall(url, id) {
  let install = lexblog_network_analytics.installs[id];
  let pw = install.pw;
  let user = install.user;

  // Create a Basic Auth token
  let token = btoa(`${user}:${pw}`);
  let auth = {
      'Authorization': `Basic ${token}`
  };

  try {
    let data = await $.ajax({
        url: url,
        method: 'GET',
        dataType: 'json',
        headers: auth
    });

    return data;

  } catch (error) {
    console.error('Request failed:', error);
    return [];
  }
}

That file uses this cool function called btoa() to Base64-encode the raw username-and-password combo into the token that the Basic Authorization header expects.

The part where we say, “Oh Right, CORS.”

Whenever I have a project where Ajax calls are flying around all over the place, working reasonably well in my local environment, I always have a brief moment of panic when I try it on a real website, only to get errors like this:

CORS console error.

Oh. Right. CORS. By default, browsers block cross-origin Ajax requests unless the site being called explicitly allows them. In this project, I absolutely do need the dashboard site to make many Ajax calls to the 25 client sites, so I have to tell the client sites to allow CORS:

<?php

  // ...

  function __construct() {
    add_action( 'rest_api_init', array( $this, 'maybe_add_cors_headers' ), 10 );
  }

  function maybe_add_cors_headers() {
    // Only allow CORS for the endpoints that pertain to this plugin.
    if ( $this->is_dba() ) {
      add_filter( 'rest_pre_serve_request', array( $this, 'send_cors_headers' ), 10, 2 );
    }
  }

  function is_dba() {
    $url     = $this->get_current_url();
    $ep_urls = $this->get_endpoint_urls();
    $out     = in_array( $url, $ep_urls );

    return $out;
  }

  function send_cors_headers( $served, $result ) {
    // Only allow CORS from the dashboard site.
    $dashboard_site_url = $this->get_dashboard_site_url();

    header( "Access-Control-Allow-Origin: $dashboard_site_url" );
    header( 'Access-Control-Allow-Headers: Origin, X-Requested-With, Content-Type, Accept, Authorization' );
    header( 'Access-Control-Allow-Methods: GET, OPTIONS' );

    return $served;
  }

  [...]

}

You’ll note that I’m following the principle of least privilege by taking steps to only allow CORS where it’s necessary.

Auth, Part 2: I’ve been known to auth myself

I authenticated an Ajax call from the dashboard site to the client sites. I registered some logic on all the client sites to allow the request to pass CORS. But then, back on the dashboard site, I had to get that response from the browser to the server.

The answer, again, was to make an Ajax call to the WordPress REST API endpoint for storing the data. But since this was an actual database write, not merely a read, it was more important than ever to authenticate. I did this by requiring that the current user be logged into WordPress and possess sufficient privileges. But how would the browser know about this?

In PHP, when registering our endpoints, we provide a permissions callback to make sure the current user is an admin:

<?php

// ...

function get() {
  $version = 'v1';
  return array(

    'update_blogs' => array(
      'namespace' => 'LXB_DBA/' . $version,
      'route'     => '/update_blogs',
      'args'      => array(
        'methods' => array( 'PATCH' ),
        'callback' => array( $this, 'update_blogs_cb' ),
        'permission_callback' => array( $this, 'is_admin' ),
        ),
      ),
      // ...
    );           
  }
                      
function is_admin() {
    $out = current_user_can( 'update_core' );
    return $out;
}

JavaScript can use this — it’s able to identify the current user — because, once again, that data is localized. The current user is represented by their nonce:

async function insertBlog( data ) {
    
  let url = lexblog_network_analytics.endpoint_urls.insert_blog;

  try {
    await $.ajax({
      url: url,
      method: 'POST',
      dataType: 'json',
      data: data,
      headers: {
        'X-WP-Nonce': getNonce()
      }
    });
  } catch (error) {
    console.error('Failed to store blogs:', error);
  }
}

function getNonce() {
  if( typeof wpApiSettings.nonce == 'undefined' ) { return false; }
  return wpApiSettings.nonce;
}

The wpApiSettings.nonce global is provided by WordPress core whenever its wp-api-request script is on the page, which it is on the admin screens I'm working in, so I didn't have to localize that one myself.
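If it ever isn't on the page, localizing a REST nonce yourself is the usual fallback; here's a minimal sketch using the script handle from earlier:

<?php

// Fallback sketch: attach a REST nonce to the plugin's own script handle.
wp_localize_script(
  'lexblog_network_analytics_script',
  'wpApiSettings',
  array(
    'root'  => esc_url_raw( rest_url() ),
    'nonce' => wp_create_nonce( 'wp_rest' ),
  )
);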

Cache is King

Compressing the Google Analytics data from 900 domains into a three-minute loading .gif is decent, but it would be totally unacceptable to have to wait for that long multiple times per work session. Therefore I cache the results of all 25 client sites in the database of the dashboard site.

I’ve written before about using the WordPress Transients API for caching data, and I could have used it on this project. However, something about the tremendous volume of data and the complexity implied within the Figma design made me consider a different approach. I like the saying, “The wider the base, the higher the peak,” and it applies here. Given that the user needs to query and sort the data by date, author, and metadata, I think stashing everything into a single database cell — which is what a transient is — would feel a little claustrophobic. Instead, I dialed up E.F. Codd and used a relational database model via custom tables:

In the Dashboard Site, I created seven custom database tables, including one relational table, to cache the data from the 25 client sites, as shown in the image.
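For contrast, the transient version I passed on would have boiled down to stuffing everything into one value, something like this (hypothetical key and variable names):

<?php

// Hypothetical: the single-cell transient approach. Everything goes in whole...
set_transient( 'lxb_dba_all_analytics', $every_blog_and_every_metric, DAY_IN_SECONDS );

// ...and comes back out whole. No sorting or filtering by author, date, or firm
// without unserializing the entire blob first.
$everything = get_transient( 'lxb_dba_all_analytics' );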

It’s been years since I’ve paged through Larry Ullman’s career-defining (as in, my career) books on database design, but I came into this project with a general idea of what a good architecture would look like. As for the specific details — things like column types — I foresaw a lot of Stack Overflow time in my future. Fortunately, LLMs love MySQL and I was able to scaffold out my requirements using DocBlocks and let Sam Altman fill in the blanks:

<?php 

/**
* Provides the SQL code for creating the Blogs table.  It has columns for:
* - ID: The ID for the blog.  This should just autoincrement and is the primary key.
* - name: The name of the blog.  Required.
* - slug: A machine-friendly version of the blog name.  Required.
* - url:  The url of the blog.  Required.
* - mapped_domain: The vanity domain name of the blog.  Optional.
* - install: The name of the Multisite install where this blog was scraped from.  Required.
* - registered:  The date on which this blog began publishing posts.  Optional.
* - firm_id:  The ID of the firm that publishes this blog.  This will be used as a foreign key to relate to the Firms table.  Optional.
* - practice_area_id:  The ID of the practice area that this blog covers.  This will be used as a foreign key to relate to the PracticeAreas table.  Optional.
* - amlaw:  Either a 0 or a 1, to indicate if the blog comes from an AmLaw firm.  Required.
* - subscriber_count:  The number of email subscribers for this blog.  Optional.
* - day_view_count:  The number of views for this blog today.  Optional.
* - week_view_count:  The number of views for this blog this week.  Optional.
* - month_view_count:  The number of views for this blog this month.  Optional.
* - year_view_count:  The number of views for this blog this year.  Optional.
* 
* @return string The SQL for generating the blogs table.
*/
function get_blogs_table_sql() {
  $slug = 'blogs';
  $out = "CREATE TABLE {$this->get_prefix()}_$slug (
      id BIGINT NOT NULL AUTO_INCREMENT,
      slug VARCHAR(255) NOT NULL,
      name VARCHAR(255) NOT NULL,
      url VARCHAR(255) NOT NULL UNIQUE, /* adding unique constraint */
      mapped_domain VARCHAR(255) UNIQUE,
      install VARCHAR(255) NOT NULL,
      registered DATE DEFAULT NULL,
      firm_id BIGINT,
      practice_area_id BIGINT,
      amlaw TINYINT NOT NULL,
      subscriber_count BIGINT,
      day_view_count BIGINT,
      week_view_count BIGINT,
      month_view_count BIGINT,
      year_view_count BIGINT,
      PRIMARY KEY (id),
      FOREIGN KEY (firm_id) REFERENCES {$this->get_prefix()}_firms(id),
      FOREIGN KEY (practice_area_id) REFERENCES {$this->get_prefix()}_practice_areas(id)
  ) DEFAULT CHARSET=utf8mb4;";
  return $out;
}

In that file, I quickly wrote a DocBlock for each function, and let the OpenAI playground spit out the SQL. I tested the result and suggested some rigorous type-checking for values that should always be formatted as numbers or dates, but that was the only adjustment I had to make. I think that’s the correct use of AI at this moment: You come in with a strong idea of what the result should be, AI fills in the details, and you debate with it until the details reflect what you mostly already knew.
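Running that SQL to actually create the tables is the unglamorous part. Roughly speaking (a sketch, not the plugin's exact bootstrap code), it's a guarded query per table:

<?php

// Sketch: create the blogs table on the dashboard site if it doesn't exist yet.
function maybe_create_blogs_table() {
  global $wpdb;

  $table = $this->get_prefix() . '_blogs';

  // Bail if the table is already there.
  if ( $wpdb->get_var( $wpdb->prepare( 'SHOW TABLES LIKE %s', $table ) ) === $table ) {
    return;
  }

  $wpdb->query( $this->get_blogs_table_sql() );
}

Because of the foreign keys, the firms and practice areas tables need to exist before this one does.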

How it’s going

I’ve implemented most of the user stories now. Certainly enough to release an MVP and begin gathering whatever insights this data might have for us:

Screenshot of the final dashboard which looks similar to the Figma mockups from earlier.
It’s working!

One interesting data point thus far: Although all the blogs are on the topic of legal matters (they are lawyer blogs, after all), blogs that cover topics with a more general appeal seem to drive more traffic. Blogs about the law as it pertains to food, cruise ships, germs, and cannabis, for example. Furthermore, the largest law firms on our network don’t seem to have much of a foothold there. Smaller firms are doing a better job of connecting with a wider audience. I’m positive that other insights will emerge as we work more deeply with this.

Regrets? I’ve had a few.

This project probably would have been a nice opportunity to apply a modern JavaScript framework, or just no framework at all. I like React, and I can imagine how cool it would be to have this application driven by changes in state rather than… drumroll… a couple thousand lines of jQuery!

I like jQuery’s ajax() method, and I like the jQueryUI autocomplete component. Also, there’s less of a performance concern here than on a public-facing front-end. Since this screen is in the WordPress admin area, I’m not concerned about Google admonishing me for using an extra library. And I’m just faster with jQuery. Use whatever you want.

I also think it would be interesting to put AWS to work here and see what could be done through Lambda functions. Maybe I could get Lambda to make all 25 plus 900 requests concurrently with no worries about browser limitations. Heck, maybe I could get it to cycle through IP addresses and sidestep the 429 rate limit as well.

And what about cron? Cron could do a lot of work for us here. It could compile the data on each of the 25 client sites ahead of time, meaning the initial three-minute refresh goes away. Writing a cron-based application is fine at first; coming back six months later to debug it is another matter. Not my favorite. I might revisit this later on, but for now, the cron-free implementation meets the MVP goal.
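If I do go down that road, the scheduling half on each client site is the easy part; something like this (the hook name and callback are hypothetical):

<?php

// Hypothetical: each client site pre-compiles its analytics on a schedule
// so the dashboard only ever reads warm data.
add_action( 'lxb_dba_compile_analytics', 'lxb_dba_compile_analytics_cb' );

if ( ! wp_next_scheduled( 'lxb_dba_compile_analytics' ) ) {
  wp_schedule_event( time(), 'hourly', 'lxb_dba_compile_analytics' );
}

function lxb_dba_compile_analytics_cb() {
  // Query Google Analytics and cache the results locally, ahead of time.
}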

I have not provided a line-by-line tutorial here, or even a working repo for you to download, and that level of detail was never my intention. I wanted to share high-level strategy decisions that might be of interest to fellow Multi-Multisite people. Have you faced a similar challenge? I’d love to hear about it in the comments!


WordPress Multi-Multisite: A Case Study originally published on CSS-Tricks, which is part of the DigitalOcean family. You should get the newsletter.

Categories: Designing, Others Tags:

Subscriptions Are Killing Creativity in Design: Why the Industry Must Break Free

November 27th, 2024 No comments

The graphic and web design world was once a sanctuary of creative freedom, where designers wielded their tools with boundless possibilities, limited only by imagination.

But now, a dark cloud looms over this vibrant industry: the relentless rise of subscription-based services. What was sold to us as a convenient, cost-effective model is now suffocating designers, stifling innovation, and forcing us into a perpetual cycle of dependence.

The Subscription Trap

In the past, owning design software was simple. You bought a product, installed it, and it was yours—forever. Upgrades were optional and came at your own pace.

Today, companies like Adobe, Figma, and countless others have restructured their models to lock designers into expensive monthly subscriptions. On the surface, it seems practical: always have the latest tools and updates. But this isn’t a fair trade; it’s a hostage situation.

The numbers tell the story. At the time of this writing, Adobe’s Creative Cloud subscription starts at $59.99 per month for access to essential apps like Photoshop, Illustrator, and InDesign. Over five years, that adds up to nearly $3,600. For freelancers and small studios, it’s a massive financial burden. And if you stop paying? You lose access to everything: all your files, all your tools, gone.

Creativity on a Clock

The subscription model doesn’t just hurt wallets; it punishes creativity. Deadlines and budgets are already stressful, but the looming threat of losing access to essential tools adds another layer of anxiety. Designers are forced into a “pay-to-play” reality where creativity is a service, not a skill. What happens to innovation when the tools of the trade become gated behind a recurring fee?

Even worse, many subscription services now bundle unrelated features into bloated plans, forcing designers to pay for tools they’ll never use. Want just Photoshop? Too bad. You’ll pay for the entire suite, even if you only need one or two applications. It’s the equivalent of being forced to buy a buffet ticket when all you want is a sandwich.

The New Monopoly on Design

Subscriptions also create a dangerous monopoly on creativity. Companies like Adobe, Figma, and Canva dominate the market, making it nearly impossible for independent or smaller competitors to offer alternatives. As designers, our ability to choose is eroding. The tools we use are dictated by industry standards, which are, in turn, dictated by these subscription giants.

When Figma announced its acquisition by Adobe (a deal that was ultimately called off), the collective gasp from designers worldwide wasn’t just about a business deal; it was about the future of affordable, accessible design tools. The writing is on the wall: consolidation and monopolization will leave designers with fewer options and higher costs.

Who Really Benefits?

It’s not the designers. It’s the corporations. Subscription models provide companies with predictable, recurring revenue streams, ensuring their financial security at the expense of their users. They’re no longer incentivized to create groundbreaking new tools; instead, they focus on incremental updates designed to justify the monthly fee. Meanwhile, designers are left paying more for less.

Breaking the Chains

The solution isn’t simple, but it starts with awareness and action. Designers must support alternatives to the subscription model. Open-source software like GIMP, Krita, and Inkscape offers viable, cost-effective options. Companies that still sell perpetual licenses, such as Affinity, deserve our support and advocacy.

Furthermore, we must collectively demand fairer pricing and licensing models. Why can’t companies offer modular subscriptions or rent-to-own options? Designers should be able to pay for the tools they need, not fund a corporation’s endless greed.

Conclusion: A Call to Arms

The graphic and web design community is one of resilience, creativity, and passion. But we cannot afford to let subscription models dictate our futures. It’s time to push back, explore alternatives, and reclaim the tools that allow us to create freely.

Subscriptions aren’t just killing our wallets—they’re killing the very essence of what it means to be a designer. Let’s break the cycle and rediscover the freedom to create.

Categories: Designing, Others Tags: