Netlify just dropped a new thing: Build Plugins. (It’s in beta, so you have to request access for now.) Here’s my crack at explaining it, which is heavily informed by David Wells’s announcement video.
You might think of Netlify as that service that makes it easy to sling up some static files from a repo and have a production site super fast. You aren’t wrong. But let’s step back and look at that. Netlify thinks about itself as a platform in three tiers:
Netlify Build
Netlify Dev
Netlify Edge
Most of the stuff that Netlify does falls into those buckets. Connecting your Git repo and letting Netlify build and deploy the site? That’s Build. Using Netlify’s CLI to spin up the local dev environment and do stuff like test your local functions? That’s Dev. The beefed-up CDN that actually runs our production sites? That’s Edge. See the product page for that breakdown.
So even if you’re just slapping up some files that come out of a static site generator, you’re still likely taking advantage of all these layers. Build is taking care of the Git connection and possibly running an npm run build or something. You might run netlify dev locally to run your local dev server. And the live site is handled by Edge.
With this new Build Plugins release, Netlify is opening up access to how Build works. No longer is it just “connect to repo and run this command when the build runs.” There is actually a whole lifecycle of things that happen during a build. This is how David described that lifecycle:
Build Starts
Cache is fetched
Dependencies are installed
Build commands are run
Serverless Functions are built
Cache is saved
Deployment
Post processing
What if you could hook into those lifecycle events and run your own code alongside them? That’s the whole idea with Build Plugins. In fact, those lifecycle events are literally event hooks. Sarah Drasner listed them out with their official names in her intro blog post:
init: when the build starts
getCache: fetch the last build’s cache
install: when the project’s dependencies are installing
preBuild: runs directly before building the functions and running the build commands
functionsBuild: runs when the serverless functions are building, if they exist on the site
build: when the build commands are executing
package: package it to be deployed
preDeploy: runs before the built package is deployed
saveCache: save cached assets
finally: build finished, site deployed
To use these hooks and run your own code during the build, you write a plugin (in Node JavaScript) and chuck it in a plugins folder, like ./plugins/myPlugin/index.js:
function netlifyPlugin(conf) {
  return {
    // Hook into lifecycle
    finally: () => {
      console.log("Finished!");
    },
  };
}

module.exports = netlifyPlugin;
…and adjust your Netlify config (file) to point to it. You’re best off reading Sarah’s post for the whole low-down and example.
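For reference, pointing the config at a local plugin looks something like this in netlify.toml today; the exact shape was still settling during the beta, so treat this as a sketch rather than gospel:

```toml
# Hypothetical netlify.toml entry; beta syntax may differ
[[plugins]]
  package = "./plugins/myPlugin"
```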
Finished my first @Netlify build plugin. Now all I need is my pre-beta access to run a test.
This is the crucial part, right? Kind of the only thing that matters. Having control is great and all, but it only matters if it’s actually useful. So now that we can hook into parts of the build process on the platform itself, what can we make it do that makes our lives and sites better?
Here’s some ideas I’ve gathered so far.
Sitemaps
David demoed having the build process build a sitemap. Sitemaps are great (for SEO), but I definitely don’t need to be wasting time building them locally very often, and they don’t really need to be in my repo. Let the platform do it and put the file live as “a build artifact.” You can’t do this for everything (e.g. my local build process needs to compile CSS and such, so I can actually work locally), but if production needs files that local doesn’t, it’s a good fit.
Notifications
Sarah demoed a plugin that hits a Twilio API to send a text message when a build completes. I do this same kind of thing having Buddy send a Slack message when this site’s deployment is done. You can imagine how team communication can be facilitated by programmatic messaging like this.
Performance monitoring
Build time is a great time to get performance metrics. Netlify says they are working on a plugin to track your Lighthouse score between deployments. Why not run your SpeedCurve CLI thing or Build Tracker CLI there to see if you’ve broken your performance budget?
Optimizations
Why not use the build time to run all your image optimizations? ImageOptim has an API you could hit. SVGO works on the command line, and Netlify says they are working on that plugin already. I’d think some of this you’d want to run in your local build process (e.g. drop an image in a folder, Gulp is watching, image gets optimized), but remember you can run netlify dev locally, which runs your build steps locally, and you could also organize your Gulp setup so that the image optimization code can be part of a watch process or be called explicitly during a build.
Images are a fantastic target for optimization, but just about any resource can be optimized in some way!
I built a @Netlify build plugin to run subfont on your website. Seems like I don’t have access to the build plugins myself yet, so I’d appreciate it if someone would try it out and give me feedback. https://t.co/PrnL65JSwb
Failing builds
If your build process fails, Netlify already won’t deploy it. Clearly useful. But now you can trigger that failure yourself. What if that performance monitoring didn’t just report on what’s happening, but literally killed the build if a budget wasn’t met? All you have to do is throw an error or process.exit, I hear.
Even more baller, how about fail a build on an accessibility regression? Netlify is working on an Axe plugin for audits.
Clearly you could bail if your unit tests (e.g. Jest) or end-to-end tests (e.g. Cypress) fail, meaning you could watch for 404s and all sorts of user-facing problems and prevent problematic deploys entirely.
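A minimal sketch of that kind of gate. The preDeploy hook, the 200 KB budget, and the measured size are all assumptions; the point is just that throwing inside a hook stops the deploy:

```javascript
// Hypothetical performance gate; the budget and measured size are made up.
function assertBudget(bundleKb, budgetKb) {
  if (bundleKb > budgetKb) {
    // An uncaught error in a hook fails the build, so a bad bundle never ships.
    throw new Error(`Bundle is ${bundleKb} KB, budget is ${budgetKb} KB`);
  }
}

function netlifyPlugin({ budgetKb = 200 } = {}) {
  return {
    preDeploy: () => assertBudget(150 /* measured bundle size, KB */, budgetKb),
  };
}

module.exports = netlifyPlugin;
```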
Use that build
Netlify is clearly all-in on this JAMstack concept. Some of it is pretty obvious. Chuck some static files on a killer CDN and the site has a wonderfully fast foundation. Some of it is less obvious. If you need server-powered code still, you still have it in the form of cloud functions, which are probably more powerful than most people realize. Some of it requires you to think about your site in a new way, like the fact that pre-building markup is not an all-or-nothing choice. You can build as much as you can, and leave client-side work to do things that are more practical for the client-side to do (e.g. personalized information). If you start thinking of your build process as this powerful and flexible tool to offload as much work as possible to, that’s a great place to start.
Today businesses and organizations are connected to their clients, customers, users, employees, vendors, and sometimes even their competitors. Data can tell a story about any of these relationships, and with this information, organizations can improve almost any aspect of their operations.
Although data can be valuable, too much information is unwieldy, and the wrong data is useless. The right data collection method can mean the difference between useful insights and time-wasting misdirection.
Luckily, organizations have several tools at their disposal for primary data collection. The methods range from traditional and simple, such as a face-to-face interview, to more sophisticated ways to collect and analyze data. Here are the top six methods (we’ll go into more detail about each later):
Interviews
Questionnaires and surveys
Observations
Documents and records
Focus groups
Oral histories
Qualitative vs quantitative data collection methods
Some of the methods covered here are quantitative, dealing with something that can be counted. Others are qualitative, meaning that they consider factors other than numerical values. In general, questionnaires, surveys, and documents and records are quantitative, while interviews, focus groups, observations, and oral histories are qualitative. There can also be crossover between the two methods.
Qualitative data collection methods
Data analysis can take various formats. The method you choose depends on the subject matter of your research.
Quantitative methods, such as surveys, large-scale benchmarks, and prioritization, answer the question “How much?” But these methods can leave the question “Why?” unanswered. This is where qualitative data collection methods come into play.
Understanding qualitative data collection
Qualitative data collection looks at several factors to provide a depth of understanding to raw data. While qualitative methods involve the collection, analysis, and management of data, instead of counting responses or recording numeric data, this method aims to assess factors like the thoughts and feelings of research participants. Qualitative data collection methods go beyond recording events to create context.
With this enhanced view, researchers can
Describe the environment. Understanding where observations take place can add meaning to recorded numbers.
Identify the people involved in the study. If research is limited to a particular group of people, whether intentionally or as a function of demographics or other factors, this information can inform the results.
Describe the content of the study. Sometimes, the specific activities involved in research and how messages about the study were delivered and received may illuminate facts about the study.
Interact with study participants. Interactions between respondents and research staff can provide valuable information about the results.
Be aware of external factors. Unanticipated events can affect research outcomes. Qualitative data collection methods allow researchers to identify these events and weave them into their results narrative, which is nearly impossible to do with just a quantitative approach.
Qualitative research methods
There are three commonly used qualitative data collection methods: ethnographic, grounded theory, and phenomenological.
Ethnography comes from anthropology, the study of human societies and cultures. Ethnography seeks to understand how people live their lives. Through this method, researchers veer away from the specific and practical questions that traditional market researchers use and instead observe the participants in a nondirected way. This approach is intended to reveal behaviors from a subject’s perspective rather than from the view of the researchers.
Ethnography helps fill in the blanks when a participant may not be able to articulate their desires or the reasons for their decisions or behaviors. Instead of, or in addition to, asking why a participant acts a certain way, researchers use observation to understand the why behind these desires, decisions, or behaviors.
Grounded theory arose when sociological researchers sought to provide a level of legitimacy to qualitative research — to ground it in reality rather than assumptions. Before this method, qualitative data analysis was actually done before any quantitative data was collected, so it was disconnected from the collection and analysis process.
Grounded theory uses the following methods:
Participant observation. Researchers immerse themselves in the daily lives of subjects. Another term for this is “fieldwork.”
Interviews. These can vary in formality from informal chats to structured interviews.
Document and artifact collection. Grounded theory often is about more than observation and interviews. Researchers can learn about a group of people from looking at materials the group used. For example, a local community’s laws may shed light on opinions and provide a clearer picture of residents’ sentiments.
Sometimes, a person’s true colors emerge only when they are genuinely put to the test. Phenomenology describes how people experience certain events or unique encounters. This method measures reactions to occurrences that are outside of the norm, so it’s essential to understand the whole picture, not just facts and figures.
An example of phenomenology is studying the experiences of individuals involved in a natural disaster. To analyze data from such an event, the researcher must become familiar with the data; focus the analysis on the subject matter, time period, or other factors; and categorize the data.
Completing these tasks gives the researcher a framework for understanding how the natural disaster impacts people. Together, the understanding, focus, and organization help researchers identify patterns, make connections, interpret data, and explain findings.
Each of these qualitative data collection methods sheds light on factors that can be hidden in simple data analysis. Qualitative data is one way to add context and reality to raw numbers. Often, researchers find value in a hybrid approach, where qualitative data collection methods are used alongside quantitative ones.
Quantitative data collection methods
Marketers, scientists, academics, and others may start a study with a predetermined hypothesis, but their research often begins with the collection of data.
Initially, the collected data is unstructured. Various facts and figures may or may not have context. A researcher’s job is to make sense of this data, and the choice of data collection method often helps.
Using data to determine values
One of the most widely used methods of collecting information for research purposes is quantitative data collection. Quantitative analysis relates to evaluating a numerical result. A classic example is a survey, which asks questions to collect responses that shed light on trends, preferences, actions, opinions, and any other element that can be counted.
Quantitative data collection methods are popular because they are relatively straightforward. Using these methods, researchers ask questions to collect sets of facts and figures. Quantitative data is measurable and expressed in numerical form.
While this seems like a fairly simple concept, like many aspects of research, there are various approaches to quantitative data collection that depend on the particular research being conducted.
Descriptive research explains the current status of a variable using observational data collection. Often, the researcher begins without a hypothesis and lets the data steer the direction of the study.
A simple example of quantitative descriptive research is a study that collects and tabulates test scores. Descriptive research frequently uses charts and tables to illustrate results.
Correlational research seeks to collect data that shows relationships between different occurrences. A positive correlation is one in which two variables either increase or decrease at the same time. A negative correlation is when an increase in one variable means a decrease in another.
There is also a zero correlation result, in which the relationship between two variables is insignificant. Correlation helps make predictions based on historical relationships and in determining the validity and reliability of a study.
An example of correlational data would be how a person’s height often correlates to their weight — the taller one gets, usually the heavier they are. This is a positive correlation.
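The strength of such a relationship is usually summarized as Pearson’s correlation coefficient, r, which runs from -1 (perfect negative) through 0 (no correlation) to +1 (perfect positive). A quick sketch, with invented height/weight pairs:

```javascript
// Pearson correlation coefficient: +1 perfect positive, -1 perfect negative, 0 none.
function pearson(xs, ys) {
  const n = xs.length;
  const mean = (a) => a.reduce((sum, v) => sum + v, 0) / n;
  const mx = mean(xs);
  const my = mean(ys);
  let num = 0, dx = 0, dy = 0;
  for (let i = 0; i < n; i++) {
    num += (xs[i] - mx) * (ys[i] - my); // covariance term
    dx += (xs[i] - mx) ** 2;            // spread of x
    dy += (ys[i] - my) ** 2;            // spread of y
  }
  return num / Math.sqrt(dx * dy);
}

// Invented height (cm) / weight (kg) pairs that rise together, so r is close to +1.
const heights = [150, 160, 170, 180, 190];
const weights = [52, 60, 67, 75, 84];
console.log(pearson(heights, weights).toFixed(2));
```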
Experimental research, also known as “true experimentation,” uses the scientific method to determine the cause-and-effect relationship between variables. This method uses controls for all of the crucial factors that could potentially affect the phenomena of interest. Using the experimental method, researchers randomly assign participants in an experiment to either the control or treatment groups.
Quasi-experimental research, also known as “causal-comparative,” is similar to experimental research. Since it’s often impossible or impractical to control for all factors involved, quasi-experimental methods don’t control for some factors but otherwise follow the scientific method to establish a cause-and-effect relationship.
In both of these types of studies, independent variables are manipulated. But experimental data collection methods use random assignment and sampling, whereas quasi-experimental methods lack random assignment, random sampling, or both.
Experimental methods are known for producing results that are both internally and externally valid, meaning that the study is conducted, or structured, well (internal validity) and the findings are applicable to the real world (external validity). Quasi-experimental methods, on the other hand, produce results of questionable internal validity.
Application of quantitative methods in practice
There are a number of ways researchers can put different types of quantitative data collection into action without using experiments.
Quantitative surveys enable researchers to ask closed-ended questions with a provided list of possible answers. This method is easier for respondents, as they just pick from a list of responses. It’s an ideal solution for larger-scale studies that could become unwieldy with the type of open-ended questions often associated with qualitative surveys.
Because the questions and answers are standardized, researchers can use the results to make generalizations. Closed-ended questions, however, can be limiting. A respondent may not see their answer in the given choices.
Quantitative interviews are typically conducted face to face, over the phone, or via the internet. They enable researchers to not only collect information but also tailor the questions to the audience on the spot. This can help add some “why” to the “how much” collected through quantifiable means.
What are secondary data collection methods?
Since most research involves the collection of data, there are several methods for direct, or primary, data collection, including surveys, questionnaires, direct observations, and focus groups.
While primary data collection is considered the most authoritative and authentic data collection method, there are several instances where secondary data collection methods can provide value.
Understanding secondary data collection
What is secondary data collection, and why would a researcher employ it in addition to primary data? Think of secondary data as second-hand data. It’s someone else’s research, another person’s original bank of knowledge.
Second-hand data can add insight to a research project, and using secondary data is more efficient and less expensive than collecting primary data. So how can someone else’s research be valuable to your independent study? Answering this question involves understanding how a lot of research is initiated today.
The role of the government in statistical research
For a variety of reasons, lots of governmental entities and agencies collect demographic and other information on people. Governments collect data through various means, sometimes as part of other activities. The census is a primary example of valuable governmental primary data collection that can be used as a secondary data collection method in other research studies.
Several nonprofit and governmental entities specialize in collecting data to feed the efforts of other researchers.
Public sources beyond the U.S. Census Bureau include
State and Metropolitan Area Data Book
Statistical Abstract of the United States
U.S. Industry and Trade Outlook
U.S. Government Printing Office
Small Business Administration
Local chambers of commerce
County governments
Other sources of secondary data
While governments are sources of useful information, they aren’t the only suppliers of secondary data. Commercial sources include research and trade associations, as well as banks, publicly traded corporations, and other businesses.
Educational institutions are also reliable sources of secondary data. Many colleges and universities have dedicated research arms that leverage data for educational purposes. This data can often assist others in unrelated studies.
The value of secondary data
There is more to secondary data than the fact that it is cheaper than primary data; however, cost is a major reason why this data is used. If the information you need is already available, it simply makes sense to use it rather than to replicate it.
Sometimes primary data is unnecessary for a particular research goal. You should first determine whether your research questions have already been asked and answered. If so, you can devote your data collection budget to expanding on what has already been determined through other projects.
The cost of collecting primary data can be considerable. While using secondary data is cheaper, it also saves time. Time has a value of its own in research, allowing for greater emphasis on studying results.
Ultimately, using secondary data saves time and money, which facilitates a more in-depth study of the subject. Combined with primary research, secondary data can help researchers better understand their subjects and more efficiently prepare and organize results.
Top 6 data collection methods
Interviews
If you asked someone completely unaware of data analysis how to best collect information from people, the most common answer would likely be interviews.
Almost anyone can come up with a list of questions, but the key to efficient interviews is knowing what to ask. Efficiency in interviewing is crucial because, of all the primary data collection methods, in-person interviewing can be the most expensive.
There are ways to limit the cost of interviews, such as conducting them over the phone or through a web chat interface. But sometimes an in-person interview can be worth the cost, as the interviewer can tailor follow-up questions based on responses in a real-time exchange.
Interviews also allow for open-ended questions. Compared to other primary data collection methods, such as surveys, interviews are more customizable and responsive.
Observation
Observation involves collecting information without asking questions. This method is more subjective, as it requires the researcher, or observer, to add their judgment to the data. But in some circumstances, the risk of bias is minimal.
For example, if a study involves the number of people in a restaurant at a given time, unless the observer counts incorrectly, the data should be reasonably reliable. Variables that require the observer to make distinctions, such as how many millennials visit a restaurant in a given period, can introduce potential problems.
In general, observation can determine the dynamics of a situation, which generally cannot be measured through other data collection techniques. Observation also can be combined with additional information, such as video.
Documents and records
Sometimes you can collect a considerable amount of data without asking anyone anything. Document- and records-based research uses existing data for a study. Attendance records, meeting minutes, and financial records are just a few examples of this type of research.
Using documents and records can be efficient and inexpensive because you’re predominantly using research that has already been completed. However, since the researcher has less control over the results, documents and records can be an incomplete data source.
Focus groups
A combination of interviewing, surveying, and observing, a focus group is a data collection method that involves several individuals who have something in common. The purpose of a focus group is to add a collective element to individual data collection.
A focus group study can ask participants to watch a presentation, for example, then discuss the content before answering survey or interview-style questions.
Focus groups often use open-ended questions such as, “How did you feel about the presentation?” or “What did you like best about the product?” The focus group moderator can ask the group to think back to the shared experience, rather than forward to the future.
Open-ended questions ground the research in a particular state of mind, eliminating external interference.
Oral histories
At first glance, an oral history might sound like an interview. Both data collection methods involve asking questions. But an oral history is more precisely defined as the recording, preservation, and interpretation of historical information based on the opinions and personal experiences of people who were involved in the events.
Unlike interviews and surveys, oral histories are linked to a single phenomenon. For example, a researcher may be interested in studying the effect of a flood on a community. An oral history can shed light on exactly what transpired. It’s a holistic approach to evaluation that uses a variety of techniques.
As in interviewing, the researcher can become a confounding variable. A confounding variable is an extra, unintended variable that can skew your results by introducing bias and suggesting a correlation where there isn’t one.
The classic example is the correlation between murder rates and ice cream sales. Both figures have, at one time or another, risen together. An unscientific conclusion may be that the more people buy ice cream, the higher the occurrence of murder.
However, there is a third possibility that an additional variable affects both of these occurrences. In the case of ice cream and murder, the other variable is the weather. Warmer weather is a confounding variable to both murder rates and ice cream sales.
Questionnaires and surveys
Questionnaires and surveys can be used to ask questions that have closed-ended answers.
Data gathered from questionnaires and surveys can be analyzed in many different ways. You can assign numerical values to the data to speed up the analysis. This can be useful if you’re collecting a large amount of data from a large population.
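For instance, coding a five-point agreement scale as 1–5 turns a pile of text answers into numbers you can average. The scale labels here are just one common convention:

```javascript
// Code closed-ended answers as numbers: a common five-point Likert scale.
const SCALE = {
  'Strongly disagree': 1,
  'Disagree': 2,
  'Neutral': 3,
  'Agree': 4,
  'Strongly agree': 5,
};

const responses = ['Agree', 'Neutral', 'Strongly agree', 'Agree'];
const coded = responses.map((answer) => SCALE[answer]);
const average = coded.reduce((sum, v) => sum + v, 0) / coded.length;

console.log(coded);   // [4, 3, 5, 4]
console.log(average); // 4
```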
To be meaningful, surveys and questionnaires need to be carefully planned. Unlike an interview, where a researcher can react to the direction of a respondent’s answers, a poorly designed questionnaire will lead the study nowhere quickly. While surveys are often less expensive than interviews, they won’t be valuable if they aren’t handled correctly.
Surveys can be conducted as interviews, but in most cases, it makes sense to conduct surveys using forms.
Online forms are a modern and effective way to conduct surveys. Unlike written surveys, which are static, the questions presented in online forms can change according to how someone responds. For instance, if you use JotForm to create your forms, when someone answers no to a question about allergies, they won’t have to scroll past all of the related follow-up questions about specific allergies. Instead, they’ll go immediately to a question on a different topic.
You can use JotForm’s collection of form templates instead of creating a form from scratch. All you need to do is choose a template and customize it for your needs.
Modern form building also emphasizes mobile data collection, so the forms can easily be viewed and filled out on mobile devices. One concern when gathering data electronically in the EU is the European Union’s General Data Protection Regulation (GDPR). This newly enacted regulation provides privacy protection to EU residents and citizens and can result in costly fines for noncompliance. If you want to learn more about how to make sure your forms are GDPR compliant, JotForm has all the information you need.
Sampling methods in data collection
Imagine that your business serves a substantial population. Maybe you have a massive customer list (which most businesses would love), or you’re trying to gain some insights on a large group, such as the residents of a large city. In most cases, it’s impractical to try to reach each member of this population.
Sampling is the process of identifying a subset of a population that provides an accurate reflection on the whole. It can be a tricky process, as populations are often diverse. However, there are some statistical methods that can make sure a small subset of the community accurately represents the whole group.
There are five generally accepted sampling methods. Below is an overview of these methods, the pros and cons of each, and how they can be put to work in your research.
Random sampling
Just as its name indicates, random sampling involves picking respondents with no design or order, like picking names out of a hat. While randomness may seem unscientific, this method can be valuable in research. In fact, it’s the preferred way of sampling, as a truly random sample eliminates elements that can affect the validity of a study.
Randomness requires some planning. For example, randomly picking pedestrians in Manhattan’s Times Square on a Saturday afternoon will give the researcher a reasonably diverse cross-section of tourists. This would not, however, be an excellent way to test native New Yorkers who often shun the area, especially on weekends.
Computer-generated lists can aid in achieving randomness.
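In code, “picking names out of a hat” is a random sample without replacement; one standard way to draw it is a partial Fisher–Yates shuffle. The respondent list below is invented:

```javascript
// Draw a random sample of k items via a partial Fisher-Yates shuffle.
function randomSample(population, k) {
  const pool = population.slice(); // copy so we don't mutate the input
  for (let i = 0; i < k; i++) {
    // Swap a random remaining item into position i.
    const j = i + Math.floor(Math.random() * (pool.length - i));
    [pool[i], pool[j]] = [pool[j], pool[i]];
  }
  return pool.slice(0, k);
}

const respondents = Array.from({ length: 1000 }, (_, i) => `respondent-${i}`);
console.log(randomSample(respondents, 5)); // 5 distinct respondents, chosen at random
```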
Systematic sampling
Easier than random sampling, systematic sampling follows a set of rules to create regularity in sampling. An example is interviewing every tenth customer. As long as you follow the counting system, you’ll know that there’s some order to the process.
Systematic sampling retains some of the benefits of randomness, but it can be too rigid in cases where the researcher knows the counting system will skew the data one way or another, such as when every tenth customer winds up being a woman in the same age range.
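The every-tenth-customer rule is simple to express; starting from a random offset within the first interval preserves a little of the randomness. The customer list here is invented:

```javascript
// Systematic sampling: take every step-th item, starting from a random offset.
function systematicSample(population, step) {
  const start = Math.floor(Math.random() * step); // random start within the first interval
  const sample = [];
  for (let i = start; i < population.length; i += step) {
    sample.push(population[i]);
  }
  return sample;
}

const customers = Array.from({ length: 100 }, (_, i) => i + 1);
console.log(systematicSample(customers, 10)); // 10 customers, evenly spaced
```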
Convenience sampling
This is the easiest sampling method but also the least reliable. Convenience sampling involves gathering information from whoever is closest and easiest to reach. An example would be asking coworkers in the same office a question, rather than questioning every employee at the company, since going to other offices might take more time and effort. Convenience sampling can also involve using whatever data is readily available.
Sometimes, convenience sampling can be effective, such as to gain initial primary data on brand impressions or product redesigns, where participant diversity or inclusion criteria may be less important.
Clustered sampling
With clustered sampling, a researcher uses the subgroups of a population instead of individuals. Clusters are often predefined, such as municipalities in a study about the effect of a particular phenomenon across the country.
Clustered sampling is further broken down into different types — single-stage cluster sampling, where all individuals in a cluster are included in the sample, or two-stage cluster sampling, where only random individuals within the cluster are chosen.
The main benefit of clustered sampling is that some of the work is already done: A group is already clearly defined. Therefore, it can be more efficient than other methods. However, there can be bias in the study if the clusters do not accurately represent the population as a whole.
Stratified sampling
Another method that uses subgroups is stratified sampling. This data collection method involves dividing a population into subgroups that share similar characteristics.
For example, a study can break respondents down by gender or age. When the components are easy to determine, like gender or age, the risk of bias is low, especially if the data comes from the respondents. Stratified sampling reduces bias, but sometimes characteristics are difficult to ascertain, which can either frustrate the sampling process or invite bias.
Stratified and cluster sampling may sound similar. Here’s the critical difference: In stratified sampling, individuals are randomly selected from each group (or stratum). In cluster sampling, only certain clusters are used.
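A sketch of the stratified side of that difference: group first, then sample randomly within each group. The people and age brackets are invented, and the shuffle here is deliberately quick and dirty:

```javascript
// Stratified sampling sketch: group by a shared characteristic, then pick
// randomly within each group (stratum).
function stratifiedSample(population, keyOf, perStratum) {
  const strata = new Map();
  for (const item of population) {
    const key = keyOf(item);
    if (!strata.has(key)) strata.set(key, []);
    strata.get(key).push(item);
  }
  const sample = [];
  for (const group of strata.values()) {
    // Quick-and-dirty shuffle (fine for a sketch), then take perStratum members.
    const shuffled = group.slice().sort(() => Math.random() - 0.5);
    sample.push(...shuffled.slice(0, perStratum));
  }
  return sample;
}

const people = [
  { name: 'A', age: '18-29' }, { name: 'B', age: '18-29' },
  { name: 'C', age: '30-49' }, { name: 'D', age: '30-49' },
  { name: 'E', age: '50+' },   { name: 'F', age: '50+' },
];
console.log(stratifiedSample(people, (p) => p.age, 1)); // one person per age bracket
```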
Data collection and lead generation
There are many methods businesses can use to collect and analyze customer information.
The key to gaining more customers through market research is to turn data collection into lead generation. The ideal result is not only generating more leads but recognizing the opportunities that are likely to turn into increased sales.
Understanding lead generation
Leads are the fuel of a company’s sales engine. In the case of business-to-business activity, a sales lead is a person or entity that has the potential to convert to a customer or client. For businesses that have a large footprint and enjoy wide brand recognition, leads can be broadly defined as most members of a community. This is the case with B2C companies such as Walmart and McDonald’s, which often spend much less time and effort reaching their audience as a result.
Leads are identified through a variety of methods, data collection included. With more and more individuals and businesses using the internet to shop for virtually every possible product and service, online sources of lead generation are quite popular and effective. Almost 90 percent of all Americans use the internet, and the small percentage of those who don’t use it in their daily lives are typically older adults.
Leveraging online interactions to power lead generation
How do businesses successfully tap into the large percentage of people online? One of the most effective ways to turn engagement with the public into valuable leads is by using forms. Forms allow businesses to collect critical data from potential customers, such as name, email address, industry, job title, and more.
While it’s easy to understand how a form can collect this sort of information, the key step is engaging potential customers and getting them to provide the information you need. One particularly effective method is to give potential customers something of value in exchange for their information.
Such an incentive can be a discount code for becoming a member, a free downloadable checklist, an e-book, or a white paper. This gives your potential customers a positive interaction and association with your brand and provides you with the tools needed to target them at a later date.
Gated content to capture potential
Using a content gate can help you collect the information you want. A content gate requires your potential customers to provide you with valuable contact information in order to access the material, discount codes, or other things of value you’re offering. You can also track the content for later use. Without a content gate, someone can view the content, then move on without ever being reminded about the benefits of your products or solutions.
Controlling the lead-generation process
Using forms for downloadable content is an effective method for gathering high-quality leads. However, there is a risk of getting inaccurate information.
For example, a potential customer in the early stages of shopping for products and services may be less likely to use their email address because of spam fears. This can lead to people creating separate email accounts for signing up for these sorts of downloads or entering false email addresses to bypass a system.
There are several ways to combat potential disruptions to your lead-generation campaign. First, you can use systems that verify an email address is valid, along with technology such as CAPTCHA, to ensure that your site isn’t getting bombarded by bots.
Another method is Internet Protocol (IP) limiting, which restricts the number of downloads available to users from the same IP address. Businesses can set a threshold number of downloads, and the system will disable access after that number has been reached or after a certain time frame.
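The IP-limiting idea can be sketched as a small counter keyed by address. This is an illustrative in-memory version only; a production system would also expire counts after the time frame mentioned above:

```python
from collections import defaultdict

# Allow up to `limit` downloads per IP address, then deny further requests.
class DownloadLimiter:
    def __init__(self, limit=3):
        self.limit = limit
        self.counts = defaultdict(int)  # downloads seen per IP so far

    def allow(self, ip):
        if self.counts[ip] >= self.limit:
            return False  # threshold reached: disable access
        self.counts[ip] += 1
        return True

limiter = DownloadLimiter(limit=2)
print([limiter.allow("203.0.113.5") for _ in range(3)])  # [True, True, False]
```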
More lead-generation tools
In addition to using gated forms to collect lead data, businesses can use social media and Search Engine Optimization (SEO). Social media helps you to establish your brand as an authority and to connect with and engage current and prospective customers. Optimizing your website can help put your brand in front of more internet searchers.
Whether you employ forms, social media, SEO, or a combination of methods, it all starts with a lead-generation strategy that’s tailored to your business and your potential customers.
Data collection through online forms
Collecting data with the Likert scale
Even those unfamiliar with data collection and research have probably had a brush with the Likert scale. It’s a series of questions or items that call for respondents to provide an answer on a scale — for example, a range of one through six, with one meaning strongly disagree and six meaning strongly agree with a given statement.
If you’re conducting surveys, you need to know how to analyze Likert scale data.
The Likert scale measures attitudes, which can be a helpful indicator for businesses looking to gauge customer opinions on products, services, and more. These scales usually contain five to seven points and follow a linear pattern.
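Because each answer maps to a number, basic Likert analysis is straightforward. This sketch scores a set of invented responses to one item on the one-to-six scale described above:

```python
from statistics import mean
from collections import Counter

# Hypothetical answers to one Likert item (1 = strongly disagree,
# 6 = strongly agree)
responses = [5, 6, 4, 5, 3, 6, 5, 4]

avg = mean(responses)        # central tendency of attitudes
dist = Counter(responses)    # how many respondents chose each point

print(f"mean score: {avg:.2f}")
print("distribution:", dict(sorted(dist.items())))
```

A mean near the top of the scale suggests broadly favorable attitudes; the distribution shows whether opinion is clustered or split.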
JotForm makes incorporating Likert scales into surveys and questionnaires easy.
What is informed consent?
Obtaining consent is an important component of online data collection. JotForm’s consent form templates allow users to create custom consent forms for a variety of situations. These include photo releases, parental consent, authorization to release, and other forms.
Healthcare providers in particular need to obtain informed consent before collecting data. In order for informed consent to be valid, four elements generally need to be present:
The competency to make the decision. Competency is a legal status determined by a court. A person’s capacity to make a decision isn’t black and white. Some decisions may be within a person’s ability to make, while others may fall outside of their ability. Adults are presumed to have the capacity to make informed consent decisions.
Disclosure by the medical provider of expected benefits and risks. This is the informed part of the informed consent concept, and it places the burden on the provider to fully inform someone before asking for their consent. How much information do you have to disclose? The answer is adequate information to make an informed decision. This likely means disclosing all of the potential risks and benefits of a procedure, medication, etc., and the likelihood of side effects or other adverse reactions.
The person providing consent must understand the information. This element is similar to capacity but can involve breaking down language barriers or other factors that might impede understanding beyond competency.
Consent must be given voluntarily. This means that the permission is given of one’s free will, by choice, and not through force or compulsion. Factors that affect whether consent is voluntary can include subtle elements such as socioeconomic status or obvious ones such as threats or coercion.
Each of these elements can be part of an online consent form. Detailed disclosures and questions can identify whether a component is lacking.
Why is informed consent important?
Without informed consent, your data will be invalid. Results from unknowing participants have less value and are often unusable. With JotForm, you can ensure that you properly document consent, making your job as a researcher, business owner, or healthcare provider easier.
Survey feedback forms
Another area of interest for researchers is feedback on a survey or research method. Follow-up questions through a feedback form can shed light on a study. For example, if respondents don’t understand the questions in a survey, their feedback can help a researcher understand the reason for poor results or identify potentially inaccurate findings. Feedback can also help fuel process improvements in a business.
With JotForm, companies can easily solicit feedback from clients. While individual business activities may guide specific feedback questions, some of the best feedback is simple and client-driven, and offers the client an opportunity to provide a narrative.
Feedback can also incorporate some of the other elements of data collection, such as a Likert scale, to gauge satisfaction.
With these tools and forms, businesses can turn qualitative data into quantitative data and draw meaningful conclusions from it. For example, feedback provides qualitative data, such as how a participant felt after a study’s conclusion. This emotional data can be scored and assigned a value, which then can be used to gauge the effectiveness of a particular component of a study.
Accessing the data
One of the main benefits of using JotForm is the ability to easily tap into data analytics. JotForm’s report section presents the collected data in a clear format, eliminating extra steps and making data analysis more efficient and cost-effective.
From collecting data to using various methods to simply and efficiently gauge attitudes and manage consent and feedback, JotForm provides tools that enable better research.
JotForm as a data collection tool
Data can provide insight into customers, processes, and employees; it can often give companies the leverage they need to gain a significant competitive advantage.
While information can drive innovation, businesses need to find efficient methods of collecting and organizing data. Like many technological tools, data is only beneficial when it’s used wisely; data doesn’t deliver miracles to businesses but instead provides an opportunity for improvement.
Get the right data at the right time
JotForm helps organizations collect the data they need by automating complex tasks, including collecting customer info and other data, as well as collecting payments, such as subscription fees and donations.
JotForm’s online Form Builder makes creating professional-looking forms easy with options to customize the layout and branding of forms.
JotForm can save you time and effort. Form templates, themes, widgets, and integrations give your forms more power and drive better results.
JotForm gives forms life across a variety of industries
There are many ways to use JotForm, but a few key examples illustrate the benefits of using online forms to collect data. These examples show how important it is to know which data to collect and some surprising ways different industries use data.
Data collection in the classroom
Educational institutions can use data collection methods to help propel learning. Long before computers and advanced analytics, teachers tallied quiz and test scores and analyzed the results not just for reporting purposes but also to identify trends in a particular class and help guide the learning process.
These goals remain, but with new tools, teachers and administrators can go deeper. Data from different sources, such as attendance records and homework scores, for example, provides opportunities for richer insights and the ability to identify correlations that might not be visible at first glance.
This data has already provided direction to educational institutions. EdTech tools such as learning software and digital games generate extensive data that can gauge student learning.
Educational data collection also shows the importance of not merely collecting information but tailoring it to the needs of students. And of course, input from teachers is essential for effective data collection.
Summer camps benefit from form automation
Summer camps are both similar to and different from educational institutions. Some of their data collection needs are educational, while others aren’t.
Camps have a few challenges, such as getting to know campers quickly to make sure they start benefiting from the experience right away. A form creator can help make information collection easier and quicker.
Sports camps, for example, may want to know specific, measurable data such as recent statistics about or skill levels of the campers. Tailored forms help collect the information you need.
Camps also need payment forms, contact information forms, consent forms, and health forms. Forms help summer camps get all the pertinent information needed for a successful camp. Automation and tracking help ensure nothing falls through the cracks.
Forms can help professional photographers present their portfolios and gather customer data
Like many business owners, professional photographers deal with a lot of paperwork. From appointment and session agreement forms to payment collection and model release forms, JotForm allows photographers to focus on their clients, not on collecting information and getting forms signed.
Forms help art galleries collect data
Art galleries benefit from JotForm’s data collection features too. The art business has unique challenges. Sales tend to be larger-ticket items with less demand, which makes data-driven variables such as payments, event registration, and feedback crucial. Since customers are a discerning crowd, art galleries often spend more time developing relationships than making individual sales.
These relationships can be fleshed out with customer data on likes, dislikes, previous pieces purchased, and any other data that can help direct sales and marketing efforts more efficiently. If you already know that customer A is fond of photography, then you can be sure to target that customer with marketing materials about photography.
JotForm can also help manage another unique component of the business of art: managing submissions by artists. It’s critical to gather accurate information from artists, not only for customers but also to plan and develop seasonal exhibitions. JotForm provides a communication and tracking solution so that artists and galleries can collaborate effectively. Artists and galleries can log in and quickly see what information has been submitted, and export information to a shareable PDF with JotForm’s free PDF Editor.
Provide more responsive healthcare with JotForm
Most people are aware of the small mountain of paperwork required in healthcare today. JotForm helps speed up new patient onboarding and other form-heavy processes that can make patients feel neglected.
With easily fillable forms, you can direct patients to a simple portal where they can upload their information in advance. Not only is this a time-saver, but it can also help identify potential problems, such as preexisting conditions and possible drug interactions beforehand.
These are just a few of the many examples that demonstrate how JotForm can fuel efficiencies in organizations, resulting in better service and happier customers.
(This is a sponsored article.) You know how critical it is to build websites that load quickly. All it takes is for a page to load one second too long for it to start losing visitors and sales. Plus, now that Google has made mobile-first indexing the default, you really can’t afford to let any performance optimizations fall by the wayside, especially given how difficult it can be to get your mobile site as speedy as your desktop site.
Google takes many factors into account when ranking a website and visitors take maybe a handful of factors into account when deciding to explore a site. At the intersection of the two is website speed.
It should come as no surprise that images cause a lot of the problems websites have with speed. And while you could always just trim the fat and build more minimally designed and content-centric sites, why compromise?
Images are a powerful force on the web.
Not only can well-chosen images improve the aesthetics of a site, but they also make it easier for your visitors to consume content. Of course, there are the SEO benefits of images, too.
So, today, let’s focus on how you can still design with as many images as you want without slowing down your website. This will require you to update your image optimization strategy and adopt a tool called ImageKit, but it shouldn’t take much work from you to get this new system in place.
The Necessity Of An Image Optimization Strategy For Mobile
The median size of a desktop website in 2019 is 1939.5 KB.
The median size of a mobile website in 2019 is 1745.0 KB.
If we don’t get a handle on this growth, it’s going to be impossible to meet consumer and Google demands when it comes to providing fast websites. That or we’re going to have to get really good at optimizing for speed.
Speaking of speed, let’s see what HTTP Archive has to say about image weight.
As it stands today:
The median size of images on desktop is 980.3 KB out of the total 1939.5 KB.
The median size of images on mobile is 891.7 KB out of the total 1745.0 KB.
Bottom line: images add a lot of weight to websites and consume a lot of bandwidth. And although this data shows that the median size of images on mobile is less than their desktop counterparts, the proportion of images-to-website is slightly larger.
That said, if you have the right image optimization strategy in place, this can easily be remedied.
Here is what this strategy should entail:
1. Size Your Images Correctly
There are lots of tedious tasks you’d have to handle without the right automations in place. Like resizing your images.
But you have to do it, right?
Let’s say you use Unsplash to source a number of images for a website you’re working on.
Unlike premium stock repositories where you might get to choose what size or file format you download the file in, you don’t get a choice here.
So, you download the image and any others you need. You then have the choice to use the image as is or manually resize it. After looking at the size of the file and the dimensions of the image, you realize it would be a good idea to resize it.
This particular image exported as a 3.6 MB file and a 5591×3145 px image. That’s way too big for any website.
There’s no reason to upload images larger than 1 MB — and that’s even pushing it. As for dimensions? Well, that depends on the width of your site, but I think somewhere between 1200 and 2000 px should be your max.
You’re going to have to go through this same process whether images come from a stock site or from someone’s DSLR. The point is, no source image is ever going to come out the “right” size for your website, which means resizing has to take place at some point.
What’s more, responsive websites display images in different sizes depending on the device or browser they’re viewed on. And then there are the different use cases — like full-sized image vs. thumbnail or full-sized product photo vs. featured image.
So, there’s more resizing that has to be done even after you’ve gone through the trouble of manually resizing them.
Here’s what you shouldn’t do:
Resize images one-by-one on your own. It’s time-consuming and inefficient.
Rely on browser resizing to display your images responsively; forcing the browser to scale down a full-size file wastes bandwidth and can cause rendering issues.
Instead, you can integrate your existing image server (on your web host) or external storage service (like S3) with ImageKit. Or you can use ImageKit’s Media Library to store your files.
As you can see, ImageKit has accepted the upload of this Unsplash photo at its original dimensions and size. The same goes for wherever your files originate from.
However, once you integrate your images or image storage with ImageKit, the tool will take control of your image sizing. You can see how that’s done here:
Let me briefly explain what you’re looking at above:
The Image Origin Preference tells ImageKit where images need to be optimized from. In this case, it’s the ImageKit Media Library and they’ll be served over my website.
The Old Image URL is a reminder of where our images lived on the server.
The New Image URLs show where your images will be served from once they’re optimized through ImageKit.
The formula is simple enough. You take the original URL for your image and you transform it with the new ImageKit URL.
The ImageKit URL alone will instantly shrink the size of your image files. However, if you want to do some resizing of your image’s dimensions while you’re at it, you can use transformation parameters to do so.
For example, this is the Unsplash photo as seen from the media library of my website. It lives on my own servers, which is why the address shows my own URL:
To see what it looks like once ImageKit has transformed it, I swap out my domain name with the endpoint provided by ImageKit. I then add my image resizing parameters (they allow you to do more than just resize, too) and reattach the remainder of the URL that points to my image storage.
This is what happens when I use ImageKit to automatically resize my image to 1000×560 pixels:
To create this resized image, I transformed the ImageKit URL into the following:
https://imagekit.io/vq1l4ywcv/tr:w-1000,h-560/…
It’s the width (w-) and height (h-) parameters that reduced the file’s dimensions.
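The URL structure is regular enough to build programmatically. Here’s a small sketch; the endpoint ID and image path are placeholders, not a real account, and only the `w`/`h` parameters shown above are assumed:

```python
# Compose an ImageKit-style transformation URL: endpoint, then a "tr:" segment
# of comma-separated parameters, then the image path.
def imagekit_url(endpoint, path, **params):
    tr = ",".join(f"{k}-{v}" for k, v in params.items())
    return f"{endpoint}/tr:{tr}/{path.lstrip('/')}"

url = imagekit_url("https://ik.imagekit.io/demo", "photo.jpg", w=1000, h=560)
print(url)  # https://ik.imagekit.io/demo/tr:w-1000,h-560/photo.jpg
```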
Now, as you can see, this isn’t as pixel-perfect as the original image, but that’s because I have quite a bit of compression applied to the file (80%). I’ll cover how that works below.
In the meantime, let’s focus on how great the image still looks as well as the gains we’re about to get in speed.
Previously, this was a 3.6 MB file for the 5591×3145 px image. Now, it’s a 128 KB file for the 1000×560 px image.
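As a quick back-of-the-envelope check on those numbers (this is arithmetic on the figures quoted above, not ImageKit output):

```python
# Savings from 3.6 MB original to the 128 KB resized-and-compressed version
original_kb = 3.6 * 1024   # 3.6 MB expressed in KB
optimized_kb = 128

reduction = 1 - optimized_kb / original_kb
print(f"size reduction: {reduction:.1%}")  # size reduction: 96.5%
```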
To sweeten the deal further, ImageKit makes it easy to resize your images this way using URL-based image transformation. Essentially, it works like this:
You save one master image to ImageKit’s media library or your preferred server.
ImageKit automatically uses multiple techniques to bring down the image size significantly.
You can then use ImageKit’s resizing and cropping parameters to modify each image to cater to different device resolutions and sizes.
When 91mobiles took advantage of this form of image optimization, it saved 3.5 TB of bandwidth every month. And it didn’t have to do anything but integrate with the platform. There was no need to move its images to ImageKit or another third-party storage service. It all took place within its legacy infrastructure.
2. Use Faster-loading Image Formats
It’s not just the size of your images that drain storage space and bandwidth. The file types you use have an impact, too.
PNGs, in general, are used for things like logos, images containing text and other super-fine images that have a transparent background. While you can use them to save your photos, they tend to produce the largest sizes. Even when lossless compression is applied, PNGs still remain larger in size than other file types.
GIFs are the animated counterpart of PNGs and use lossless compression as well.
JPGs, on the other hand, are best suited for colorful images and photos. They’re smaller in size and they shrink down with lossy compression. It’s possible to compress JPGs enough to get them to a manageable size, but you have to be careful as lossy compression degrades the overall quality of a file and there’s no turning back once it’s been done.
WebPs have been gaining in popularity since Google introduced them in the early 2010s. According to a Google study, WebPs can be anywhere between 25% and 34% smaller than JPGs. What’s more, you can use both lossy and lossless compression on WebPs to get them down to even smaller sizes.
Something to keep in mind with WebPs is that they’re not universally accepted. As of writing this, WebPs aren’t accepted by iOS devices. However, the latest versions of all other browsers, Google or otherwise, will gladly display them.
As for how ImageKit helps with this, it’s simple really:
When this setting is configured, ImageKit automatically determines the best file format to deliver each of your files in. It takes into account the original image’s format and content, along with whether or not the visitor’s device supports it.
JPGs, PNGs and GIFs will all be converted into WebPs when possible — say, if the visitor visits from Chrome (which accepts them). If it’s not possible — say, if the visitor visits from Safari (which doesn’t accept them) — ImageKit will convert to the best (i.e. smallest) format with the defined transformations. This might be a PNG or JPG.
Nykaa was able to capitalize on this image optimization strategy from ImageKit. Even though its website had already been designed using a mix of JPGs and PNGs stored in a number of places around the web, ImageKit took care of automating the image formats right from the original URLs.
3. Compress Images
Next, we need to talk about image compression. I’ve already referenced this a couple times, but it breaks down to two types:
Lossless
This form of compression is used on PNGs and GIFs. To compress the file, metadata is stripped out. This way, the integrity of the image remains intact, but the file shrinkage isn’t as substantial as you’d get with lossy compression.
Lossy
This form of compression is applied to JPGs and WebPs. To compress the file, some parts of the image are “lost”, which can give certain spots a grainier appearance than the original image. In most cases, it’s barely noticeable unless you look closely at a side-by-side of the two images. But to your visitors, the degradation is easy to miss since there’s no original to compare against.
With lossy compression, you get to control the quality percentage of the output file. A safe range is anywhere from 70% to 80%. ImageKit, by default, sets its optimization to 80%, and it estimates that you can save at least 20% to 25% of your file size just from that. In reality, though, it’s often more (upwards of 40%, as in the Unsplash image example above).
You can change this to whatever default you believe will maintain quality while giving you the image sizes that help your site load quickly.
Whether you use the default or your own optimization setting, remember to switch on the additional compression settings available under the Advanced tab.
These three settings, in particular, will enable you to compress as much, and as safely, as possible.
The first setting “Save a Copy”, for instance, keeps your original images on the ImageKit server. That way, you have a copy of the image pre-compression without having to manage the burden of it on your own server.
The second setting “Preserve Image Metadata” enables you to apply lossless compression when feasible.
And the last setting “PNG Image Compression Mode” allows you to decide what level of lossless optimization you want to use on your PNGs: maximum, minimum or none.
When done, you’ll end up with results like this side-by-side comparison:
This is a JPG from Unsplash. Can you tell which is the original and which is the compressed and resized version from ImageKit?
The one on the left with the black trim is:
1500×1005 px
266 KB
Compressed at 95%
The one on the right with the white trim is:
5444×3649 px
2.5 MB
Original
It’s up to you to decide which of the ImageKit compression and optimization settings you’re most comfortable using and then configure accordingly.
4. Save to and Pull Images from External Server
There are two ways to run images through ImageKit.
The first is by uploading your images directly to its Media Library:
The second is by integrating with your website or external storage service. We’ve actually already seen this part of ImageKit. It’s where you get your URL endpoints from so you can define your image parameters:
Even with all of the optimizations above, you might still be having a hard time with image storage and maintenance — either because of how they affect your speed or how much storage you have to hold them.
For instance, if you store your images on your server, you’ll eventually be constrained for space (unless you have a monster-sized hosting account).
When you’re building massive e-commerce stores or business websites with thousands or even millions of images and corresponding image sizes, you can’t afford to be hosting those images on your own. Granted, there is a way to serve them more quickly to visitors (which I’ll explain in the next point), but why take on the burden and cost of additional storage if you don’t have to?
5. Add a CDN
A CDN is another essential optimization tool for large repositories of images. Think of it like a second server, only this one caches (copies) your website’s files and serves them through data centers located significantly closer to your visitors around the world.
As a result, your website and its thousands of product images travel from, say, New York to Bangalore, India insanely fast.
With ImageKit, you get to enjoy the privilege of serving your images not just through its core processing servers, but through AWS CloudFront CDN (included in all plans) which has over 150 locations worldwide.
Sintra, a client of ImageKit, saw a big leap in performance after moving to ImageKit. With the ImageKit image CDN (which has delivery nodes all around the globe), it saw an 18% drop in page load times.
Wrapping Up
What’s especially nice about ImageKit is that it’s not just a preventative measure against slowdowns caused by images. You can use it to retroactively fix and improve mobile websites and PWAs, even if they already have millions of images on them. What’s more, the performance center makes it easy to keep an eye on your website’s images and identify opportunities for speed enhancements.
Plus, as you can see from the tips above, ImageKit has simplified a lot of the work you’d otherwise have to do, whether you’d handle it manually or configure it through a plugin.
With consumers and Google becoming pickier by the day about how quickly websites load on mobile, this is the kind of image optimization strategy you need. It’ll lighten your load while ensuring that any images added before or after ImageKit are optimized to the fullest. Even better, your clients will reap the benefits of more leads and greater conversions.
There was a time when collecting data on the go involved a lot of preparation, manual tasks, and sprints to places with a stable internet connection when you were in a pinch.
But that is changing as remote work becomes more common and people have access to the internet in more places.
Consider the 2017 Samsung-commissioned study that surveyed 1,205 people who spent at least a third of their day working remotely, worked at companies with at least 100 employees, and had to own a smartphone.
The study yielded a number of eyebrow-raising findings across six groups of jobs, including site workers, field service workers, drivers, public safety workers, and healthcare workers:
Ninety-three percent used a smartphone for work every day.
They spent close to 33 percent of their day using cell phones at work.
The top three tasks they performed most frequently were emailing, texting, and making calls.
They viewed or edited documents about five times a day and accessed business applications and virtual desktops with the same frequency.
If they could use only one device at work and had to choose between a cell phone or traditional computer, 42 percent opted for their smartphone.
Forty percent said smartphones have replaced the need for a traditional computer or will do so in a few years.
Software companies have noticed and are responding by releasing new apps every year that give mobile phones the same capabilities as traditional computers, if not more. That means remote workers can access the same products, features, and services, regardless of whether they’re working in an office or conducting fieldwork.
Offline features in mobile apps have even made it easy for people to access, view, collect, and share information anywhere, even if they can’t access the internet.
With so many options out there, how do you decide which one to use? We’ll take the guesswork out of finding the right mobile data collection app.
JotForm, long known as the easiest online form builder, is making a big splash by changing the conversation around mobile forms.
The form builder’s newest app, JotForm Mobile Forms, allows users to build, view, access, sort, fill out, share, and organize all data in a single place. This eliminates the need to use several siloed solutions in order to gather information, share it, and get work done.
JotForm users can even download a PDF copy of submitted form information and add form respondents as contacts in their phone.
The mobile app contains form fields that let people scan a QR code, provide their geolocation information, record a voice message, and add their electronic signature as they fill out a form. With this information, JotForm users can not only collect more specific data but also make decisions on the fly.
The app also has an offline feature that allows users to collect the information they need when there’s limited or no connection to the internet. All the information collected offline is saved and synced to a JotForm user’s account once a stable Wi-Fi or wireless data connection is available. That means JotForm users can stop searching for a Wi-Fi connection and worrying about losing data that’s submitted when there’s no internet available.
The handy kiosk mode feature allows app users to turn their mobile device into a public survey station. Kiosk mode pulls up a form, locks it in full screen so no other applications are visible, and refreshes a form after each submission. This, in turn, creates an uninterrupted experience so different people can fill out the same form at trade shows or conventions, for instance.
JotForm users can even assign forms to team members who don’t have an account but need to access a form securely, fill it out, open it in kiosk mode, and manage their submitted information.
After you provide a teammate’s name and email address, JotForm will send that person an email with instructions to access the form. While your teammates fill out the form, you can receive notifications about incoming submissions, view the data, and act quickly on the information you receive.
JotForm Mobile Forms also makes it easy for users to share forms or individual submissions. They can send forms via text message, in an email, and through social media channels, such as WhatsApp, Facebook, and Twitter.
JotForm users with a starter, or free, plan can create up to five forms with an unlimited number of questions and receive up to 100 submissions each month. They also get up to 100 MB of space for uploaded files and can store up to 500 submissions in their account at any given time.
Smartsheet counts more than 75 percent of Fortune 500 companies as users and estimates that more than 15,000 new projects and processes are created on the interactive project management platform each day.
The broad suite of features includes data visualization tools, centralized work portals, dashboard reports, work tracking boards, and automated request or approval workflows.
Smartsheet also has a mobile app, available in eight languages, that allows registered users and team members to collaborate by accessing, viewing, editing, or updating sheets from their mobile device. Users can also sort information, respond to update requests, view project timelines, make decisions on relayed approvals, and pull up reports on key metrics that are updated in real time.
The mobile app allows users and licensed collaborators to access and fill out a form attached to a sheet. Form responders can even scan a barcode, share location information, or upload files, including photos taken from their mobile device. Once a form is submitted, this information is used to populate rows in an associated sheet.
According to Smartsheet, collecting data with forms is particularly helpful for employees, such as field inspectors, who travel or work from a number of sites and need to access information.
A paid subscription is required to use Smartsheet’s features and its mobile app after a free 30-day trial period. Once the trial period is over, any content created during that time can be viewed for 90 days. This data will be deleted if you don’t sign up for a subscription during the 90-day window.
Device Magic helps workers collect and share information on the go.
To get started, Device Magic users build forms on their laptops or desktops. A drag-and-drop tool allows them to select questions from a toolbar, place them on a blank workspace, and rearrange the order of questions.
Team members added to a company’s Device Magic account can access forms through the app on their mobile devices. Permissions settings give users the flexibility to determine which devices or fellow users can access a form. Users can add devices to a Device Magic account by sending a text or email invitation to new team members.
Device Magic also has offline capabilities, so users don’t need an internet connection to access, fill out, and submit forms through the app. In these cases, submitted form information is saved on a mobile device and transmitted once an internet connection has been reestablished.
There are several advanced form fields, or questions, that allow team members to collect an electronic signature, scan a barcode, attach files, include a photo, and submit annotations to an uploaded image.
Enterprise users can even use Device Magic’s Dispatch feature to have designated team members complete forms with prepopulated information or send a completed form to specific people once it has been submitted.
This workflow can come in handy when remote workers need to share important information — such as job quotes, invoices, purchase orders, or inspection reports — and quickly get approval from managers.
Device Magic offers a 14-day free trial of its Enterprise plan, after which users can sign up for a paid plan to access many key features. Users who do not upgrade after the trial period will be downgraded to a free account.
There are no limits on the number of forms users with free accounts can create or the number of submissions they can receive. However, they can only connect one device per account, which means that they cannot assign or share forms across devices until they upgrade to a paid plan.
With Zoho Forms, registered users can build customizable forms, collect a broad range of information across devices, and manage where that data goes and how it’s used.
That same range of functionality extends to the Zoho Forms app, which allows registered users to build forms from scratch or get a jumpstart on the process by using dozens of templates. Themes used to design a form, however, don’t appear in the app, and users must have a paid plan to customize the design of their form.
The form builder offers a wide range of form fields and many types of questions, so users have more control over what data they collect and how they ask for that information. The electronic signature field, for example, allows people to sign a form on any device with a finger, stylus, trackpad, or mouse.
Users can also modify the properties of individual questions, so they can determine how information is gathered and how questions will appear on a form.
For instance, users can enable form respondents to scan barcodes, QR codes, and business cards with the camera on their mobile device. That data is then used to fill in specific information on a form, such as names, addresses, email addresses, phone numbers, product tags, or offer codes.
As another example, activating the app’s geolocation feature for a specific form allows users to determine where someone filled out and submitted information.
The Zoho Forms app allows users to publish their form on a website, share it on social media, or distribute it as a QR code or public URL. Forms, as well as any submitted form information, can be shared as tasks with other coworkers on the Zoho Forms account.
The kiosk mode feature allows a form to be displayed in full-screen view within the Zoho Forms app. Multiple people can fill out a form on the mobile device but can’t access other parts of the app, such as submitted information or other forms. Users just set a passcode to ensure form respondents don’t gain full access to the app, and then use that passcode to close kiosk mode.
The app allows users to append comments to forms for additional follow-up or background information.
The Zoho Forms app also works offline, so users don’t need an internet connection to build forms, fill them out, and submit information. The data is saved and will be transmitted or uploaded once a mobile device is reconnected to the internet.
Users on free plans can create up to three forms, receive up to 500 submissions a month, upload up to 200 MB of files, and collect up to 10 payments through their forms using Zoho’s integrations with payment processors, including Stripe and PayPal. With a basic plan, at $8 per month, users may receive up to 10,000 submissions each month and store up to 500 MB of files.
Spatial Networks, which leverages geospatial data to develop innovative solutions, launched Fulcrum for companies that rely on field-workers to collect and share information.
More than 110,000 users in 180 countries have used the data collection tool to collect upwards of 174 million records and add more than 55 million geotags to uploaded photos.
Fulcrum users must build a form on their desktop or laptop. They can either use a form builder to drag and drop the questions that they want to ask onto a blank space or select from a variety of templates by industry, such as agriculture, archaeology, business services, construction, environmental services, and emergency management.
After a form has been created, it’s synced automatically to a user’s account and the Fulcrum app. Forms can be assigned to designated team members who can then fill in information, upload photos, capture electronic signatures, and scan barcodes or QR codes. Form respondents can also share geotagged photos with annotations, videos, and audio files that can be viewed on a mobile device, desktop, or laptop.
Because these uploaded files contain geographic and location data, users can track this information on different types of maps offered by Fulcrum or other offline sources, such as Google Street or OpenStreetMap.
Fulcrum’s offline capabilities allow users to view forms, fill them out, and submit information even when a stable internet connection isn’t available. The data is stored on a form respondent’s mobile device and transmitted once a cell phone signal or Wi-Fi connection is found.
During the 14-day free trial of Fulcrum’s services and app, businesses can add up to 50 users to a single plan, export up to 100 records, create no more than three public URLs to share data, and store up to 3 GB of media files.
Once the trial period is over, users can choose to upgrade to a paid plan, starting at $28 per user/month, or $22 per user/month when billed annually.
Magpi’s mobile app, Magpi+, seeks to help nonprofits and nongovernmental organizations lower data collection costs that were traditionally spent on programmers and tech consultants.
Magpi users include well-known nonprofits, health organizations, and academic institutions, including UNICEF, Mercy Corps, the World Bank, and The Centers for Disease Control and Prevention.
Before forms can be deployed on mobile devices, Magpi users must build them on a web browser from their desktop or laptop. After logging into their Magpi account, users select the form fields that they’d like to use before formatting them and writing the desired text for all questions.
Magpi also allows users to create and fill out forms using a wide variety of languages and alphabets as long as their computers are configured properly and can read the information.
Once Magpi users create their forms, they can assign them to team members who have access to the account and the Magpi+ app downloaded on their mobile devices.
The app not only lets these team members access, fill out, and submit forms but also allows them to view submitted information as an ordered list, thumbnails based on shared images, or a map based on any geolocation data provided. Since any submitted form information is updated automatically in Magpi, teams can access this data regardless of whether they’re using a laptop, desktop, or mobile device to access their account.
Since Magpi was developed for field-workers, the Magpi+ app has offline capabilities that allow teams to collect and submit information in areas that have little or no access to the internet. In these cases, submitted form information is saved on a device and transmitted once a stable internet connection is established.
Magpi users with a free plan can create up to 20 forms, include up to 100 questions on a form, receive up to 500 submissions a month, and add up to 15 people to an account.
Conclusion
JotForm Mobile Forms stands out in the crowd as an all-in-one, end-to-end solution.
For starters, JotForm Mobile Forms allows you to build customized forms on your mobile device while you’re connected to the internet and need to work on the go. You can also give your forms a professional and polished touch on your smartphone or tablet.
This means you no longer have to fret over siloed solutions that require you to build a form on your desktop or laptop before accessing it on a mobile device.
Assigning forms to teammates and tracking their actions can be frustrating in many mobile data collection apps because you must either add them to your account or upgrade from a free plan.
With JotForm Mobile Forms, you can assign forms to colleagues who don’t have a JotForm account, even if you have a free, starter plan.
Some software companies will let you test out a mobile data collection app for a while but require you to buy a subscription once your trial period is over.
JotForm Mobile Forms, however, is free to use for anyone who has a JotForm account, even if you have a free, starter plan. There isn’t even a limit on the features that you can use for free, including the option to scan QR codes or collect electronic signatures. There are no strings attached or fine print to read, so what you see is what you get.
So, what are you waiting for?
Give JotForm Mobile Forms a try today and see how the mobile data collection app is redefining the way people think about forms.
Today we’re gonna talk about application bundlers — tools that simplify our lives as developers. At their core, bundlers take your code from multiple files and put it all together in one or more files, compiled in a logical order and ready for use in a browser. Moreover, through different plugins and loaders, you can uglify the code, bundle up other kinds of assets (like CSS and images), use preprocessors, apply code-splitting, and more. Bundlers manage the development workflow.
There are lots of bundlers out there, like Browserify and webpack. While those are great options, I personally find them difficult to set up. Where do you start? This is especially true for beginners, where a “configuration file” might be a little scary.
That’s why I tend to reach for Parcel. I stumbled upon it accidentally while watching a tutorial on YouTube. The speaker was talking about tips for faster development and he heavily relied on Parcel as part of his workflow. I decided to give it a try myself.
What makes Parcel special
The thing I love the most about this bundler: it doesn’t need any configuration. Literally, none at all! Compare that to webpack where configuration can be strewn across several files all containing tons of code… that you may have picked up from other people’s configurations or inherited from other projects. Sure, configuration is only as complex as you make it, but even a modest workflow requires a set of plugins and options.
We all use different tools to simplify our job. There are things like preprocessors, post-processors, compilers, transpilers, etc. It takes time to set these up, and often a pretty decent amount of it. Wouldn’t you rather spend that time developing?
That’s why Parcel seems a good solution. Want to write your styles in SCSS or LESS? Do it! Want to use the latest JavaScript syntax? Included. Need a server for development? You got it. That’s barely scratching the surface of the large list of other features it supports.
Parcel allows you to simply start developing. That’s the biggest advantage of using it as a bundler — alongside its blazing-fast compilation, which utilizes multicore processing, where other bundlers, including webpack, work off of complex and heavy transforms.
Where using Parcel makes sense
Parcel, like any tool, is not a silver bullet designed as a one-size-fits-all solution for everything. It has use cases where it shines most.
I’ve already mentioned how fast it is to get a project up and running. That makes it ideal when working with tight deadlines and prototypes, where time is precious and the goal is to get in the browser as quickly as possible.
That’s not to say it isn’t up to the task of handling complex applications or projects where lots of developers might be touching code. It’s very capable of that. However, I realize that those projects may very well benefit from a hand-rolled workflow.
It’s sort of like the difference between driving a car with an automatic transmission versus a stick shift. Sometimes you need the additional control and sometimes you don’t.
I’ve been working on a commercial multi-page website with a bunch of JavaScript under the hood, and Parcel is working out very well for me. It’s providing my server, it compiles my Sass to CSS, it adds vendor prefixes when needed, and it allows me to use import and export in my JavaScript files out of the box without any configuration. All of this allowed me to get my project up and running with ease.
Let’s create a simple site together using Parcel
Let’s take Parcel for a test drive to see how relatively simple it is to make something with it.
We’re going to build a simple page that uses Sass and a bit of JavaScript. We’ll fetch the current day of the week and a random image from Unsplash Source.
The basic structure
There’s no scaffolding we’re required to use or framework needed to initialize our project. Instead, we’re going to make three files that ought to look super familiar: index.html, style.scss and index.js. You can set that up manually or in Terminal:
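For example, the Terminal setup can be as simple as creating a folder and the three empty files (the folder name here is just a placeholder):

```shell
# Make a project folder (any name works) and the three files we need
mkdir parcel-demo
cd parcel-demo
touch index.html style.scss index.js
```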
You may have noticed that I’m pulling in a web font (Lato) from Google, which is totally optional. Otherwise, all we’re doing is linking up the CSS and JavaScript files and dropping in the basic HTML that will display the day of the week and a link from Unsplash that will serve a random image. This is all we really need for our baseline.
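A minimal sketch of that index.html might look like the following. The markup details are assumptions rather than the article’s exact code, but the `.today` class matches the selector used later in index.js, and linking style.scss directly is something Parcel can handle, since it compiles the Sass for us:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <title>Parcel Demo</title>
  <!-- The Lato web font is optional -->
  <link href="https://fonts.googleapis.com/css?family=Lato" rel="stylesheet">
  <!-- Parcel compiles the linked SCSS for us -->
  <link rel="stylesheet" href="./style.scss">
</head>
<body>
  <div class="container">
    <!-- index.js will fill this in with the current day of the week -->
    <h1 class="today"></h1>
    <!-- Unsplash Source serves a random image at this URL -->
    <img src="https://source.unsplash.com/random/800x400" alt="A random photo from Unsplash">
  </div>
  <script src="./index.js"></script>
</body>
</html>
```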
Marvel at Parcel’s quick set up!
Let’s run the application using Parcel as the bundler before we get into styling and scripts. Installing Parcel is like installing any other package:
npm install -g parcel-bundler
# or
yarn global add parcel-bundler
Let’s also create a package.json file in case we need any development dependencies. This is also where Parcel will include anything it needs to work out of the box.
npm init -y
# or
yarn init -y
That’s it! No more configuration! We only need to tell Parcel which file is the entry point for the project so it knows where to point its server. That’s going to be our HTML file:
parcel index.html
If we open the console we’ll see something like this indicating that the server is already running:
Server running at http://localhost:1234
Parcel’s server supports hot reloading and rebuilds the app as changes are saved.
Now, heading back to our project folder, we’ll see additional stuff that Parcel created for us:
What’s important for us here is the dist folder, which contains all our compiled code, including source maps for CSS and JavaScript.
Now all we do is build!
Let’s go to style.scss and see how Parcel handles Sass. I’ve created variables to store some colors and a width for the container that holds our content:
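The specific colors and width are assumptions here, not the article’s actual values, but the variables described might look something like this in style.scss:

```scss
// Hypothetical values — swap in your own palette and container width
$background: #2b2b2b;
$text-color: #f5f5f5;
$container-width: 800px;

body {
  font-family: 'Lato', sans-serif;
  background: $background;
  color: $text-color;
}

.container {
  max-width: $container-width;
  margin: 0 auto;
}
```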
Once we save, Parcel’s magic is triggered and everything compiles and reloads in the browser for us. No command needed because it’s already watching the files for changes.
This is what we’ve got so far:
The only thing left is to show the current day of the week. We’re going to use imports and exports so we get to see how Parcel allows us to use modern JavaScript.
Let’s create a file called today.js and include a function that reports the current day of the week from an array of days:
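A minimal version of today.js might look like this (the day strings and structure are assumptions, based on how getDay is used later in index.js):

```javascript
// today.js — a sketch; the getDay name matches the import used in index.js
const days = [
  'Sunday',
  'Monday',
  'Tuesday',
  'Wednesday',
  'Thursday',
  'Friday',
  'Saturday'
];

// Date#getDay() returns 0 for Sunday, which is why the array starts there
export function getDay() {
  return days[new Date().getDay()];
}
```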
It’s worth remembering that the getDay function returns Sunday as the first day of the week.
Notice we’re exporting the getDay function. Let’s go into our index.js file and import it there so it gets included when compiling happens:
import { getDay } from './today';
We can import/export files because Parcel supports ES6 module syntax right out of the box — again, no configuration needed!
The only thing left is to select the element and pass the value of the getDay function to it:
const day = document.querySelector('.today');
day.innerHTML = getDay();
Let’s see the final result:
Last thing is to build for production
We’ve created the app, but we want to serve it somewhere — whether it’s your personal server or some zero-configuration deployment like Surge or Now — and we want to serve compiled and minified code.
Here’s the one and only command we need:
parcel build index.html
This gives us all of the production-ready assets for the app. You can read more about Parcel’s production mode for some tips and tricks to get the most from your environment.
I’ve said it several times and I’ll say it again: Parcel is a great tool. It bundles, it compiles, it serves, it pre- and post-processes, it minifies and uglifies, and more. We may have looked at a pretty simple example, but hopefully you now have a decent feel for what Parcel offers and how you might start to use it in your own projects.
I’m interested if you’re already using Parcel and, if so, how you’ve been using it. Have you found it works better for some things more than others? Did you discover some neat trick that makes it even more powerful? Let me know in the comments!
The team at Stripe explores how they’re refining their color palette to make it more accessible and legible for users across all their products and interfaces. Not only that but the team built a wonderful and yet entirely bonkers app for figuring out the ideal range of colors that they needed.
We built a web interface to allow us to visualize and manipulate our color system using perceptually uniform color models. The tool gave us an immediate feedback loop while we were iterating on our colors—we could see the effect of every change.
This tool is…whoa! I would love to learn a bit more about why they built this though as it looks like it wouldn’t have been a particularly quick and easy thing to put together. I wonder if that team has to support a wide range of colors for their charts or data-visualization UI (as complex charts can often require a much larger range of colors for comparing bits of data effectively). Either way, this is pretty inspiring work.
This somewhat harkens to a couple of techniques for enforcing accessible colors, including one that uses custom properties with calc() and rgb by Josh Bader, and another by Facundo Corradini that also uses custom properties but with hsl with conditional statements.
Whether it’s in a novel, the latest box office smash, or when Uncle Elmer mistook a potted cactus for a stress ball, we all love stories. There are stories we love, stories we hate, and stories we wish we’d never experienced. Most of the good stories share structure and principles that can help us create consistent website experiences. Experiences that speak to user expectations and guide them to engage with our sites in a way that benefits both of us.
In this article, we’ll pull out and discuss just a few examples of how thinking about your users’ stories can increase user engagement and satisfaction. We’ll look at deus ex machina, ensemble stories, consistency, and cognitive dissonance, all of which center on audience expectations and how your site is meeting those expectations or not.
We can define a story as the process of solving a problem. Heroes have an issue, and they set out on a quest to solve it. Sometimes that’s epic and expansive like the Lord of the Rings or Star Wars and sometimes it’s small and intimate such as Driving Miss Daisy or Rear Window. At its core, every story is about heroes who have a problem and what they do to solve it. So too are visits to a website.
The user is the hero, coming to your site because they have a problem. They need to buy a tchotchke, hire an agency or find the video game news they like. Your site can solve that problem and thus play an important role in the user’s story.
Deus Ex Machina
It’s a term meaning “god from the machine” that goes back to Greek plays — even though it’s Latin — when a large, movable scaffolding or “machine” would bring out an actor playing a god. In the context of story, it’s often used to describe something that comes out of nowhere to solve a problem. It’s like Zeus showing up at the end of a play and killing the villain. It’s not satisfying to the audience. They’ve watched the tension grow between the hero and the villain and feel cheated when Zeus releases the dramatic tension without solving that tension. They watched a journey that didn’t matter because the character they loved did not affect the ending.
The danger of deus ex machina is most visible in content marketing. You hook the audience with content that’s interesting and applicable but then bring your product/site/whatever in out of nowhere and drop the mic like you won a rap battle. The audience won’t believe your conclusion because you didn’t journey with them to find the solution.
If, however, the author integrates Zeus into the story from the beginning, Zeus will be part of the story and not a convenient plot device. Your solutions must honor the story that’s come before, the problem and the pain your users have experienced. You can then speak to how your product/site/whatever solves that problem and heals that pain.
State Farm recently launched a “Don’t Mess With My Discount!” campaign:
Kim comes in to talk to a State Farm rep who asks about a Drive Safe and Save discount. First, for the sake of the discount, Kim won’t speed up to make a meeting. Next, she makes herself and her child hold it till they can get home driving the speed limit. Then, in the midst of labor, she won’t let her partner speed up to get them to the hospital. (Don’t mess with a pregnant lady or her discount.) Finally, the ad cuts back to Kim and the agent.
State Farm’s branding and their signature red color are strong presences in both bookend scenes with the State Farm representative. By the end, when they give you details about their “Drive Safe and Save” discount you know who State Farm is, how they can help you, and what you need to do to get the discount.
Throughout the ad, we know State Farm’s motivations and don’t feel duped into liking something whose only goal is to separate us from our money. They set the expectation of this story being an ad in the beginning and support that throughout.
Another Approach
Sometimes putting your name upfront in the piece might feel wrong or too self-serving. Another way to get at this is to acknowledge the user’s struggle, the pain the user or customer already feels. If your site doesn’t acknowledge that struggle, then your product/site/whatever seems detached from their reality, a deus ex machina. But if your content recognizes the struggle they’ve been through and how your site can solve their problem, the pitch for deeper engagement with your site will be a natural progression of the user’s story. It will be the answer they’ve been searching for all along.
Take this testimonial from Bizzabo:
It shows the user where Greenbook was, i.e. mired in tedious tasks, and how Bizzabo helped them get past tedium to do what Greenbook says they do best: make memorable experiences. Bizzabo doesn’t come out of the woodwork to say “I’m awesome” or solve a problem you never had. They have someone attesting to how Bizzabo solved a real problem that this real customer needed to be fixed. If you’re in the market to solve that problem too, Bizzabo might be the place to look.
Ensemble Stories
Some experiences, like some stories, aren’t about a single person. They’re about multiple people. If the story doesn’t give enough attention to each member, that person won’t seem important or like a necessary part of the story. If that person has a role in the ending, we feel cheated or think it’s a deus ex machina event. If any character is left out of a story, it should change the story. It’s the same way with websites. The user is the story’s hero, but she’s rarely the only character. If we ignore the other characters, they won’t feel needed or be interested in our websites.
Sometimes a decision involves multiple people because a single user doesn’t have the authority to decide. For instance, Drupalcon Seattle 2019 has a “Convince Your Boss” page. They showcase the benefits of the conference and provide materials to help you get your boss to agree to send you.
You could also offer a friends-and-family discount that rewards both the sharer and the sharee. (Yes, as of this moment, “sharee” is now a word.) Dropbox does this with their sharing program. If you share their service with someone else and they create an account, you get additional storage space.
But you don’t have to be that explicit about targeting other audiences than the user themselves. In social networks and communities, the audience is both the user and their friends. The site won’t reach a critical mass if you don’t appeal to both. I believe Facebook beat MySpace early on by focusing on the connection between users and thus serving both the user and their friends. MySpace focused on individual expression. To put it another way, Facebook included the user’s friends in their audience while MySpace didn’t.
Serving Diametrically Opposed Heroes
Many sites that run on ad revenue also have to think about multiple audiences, both the users they serve and the advertisers who want to reach those users. They are equally important in the story, even if their goals are sometimes at odds. If you push one of these audiences to the side, they’ll feel like they don’t matter. When all you care about is ad revenue, users will flee because you’re not speaking to their story any longer or giving them a good experience. If advertisers can’t get good access to the user then they won’t want to pay you for ads and revenue drops off.
Just about any small market newspaper website will show you what happens when you focus only on advertisers’ desires. Newspaper revenue streams have gone so low they have to push ads hard to stay alive. Take, for instance, the major newspaper from my home state of Delaware, the News Journal. The page skips and stutters as ad content loads. Click on any story and you’ll find a short article surrounded by block after block after block of ad content. Ads are paying the bills but with this kind of user experience, I fear it won’t be for long.
Let me be clear that advertisers and users do not have to be diametrically opposed, it’s just difficult to find a balance that pleases both. Sites often lean towards one or the other and risk tipping the scales too far either way. Including the desires of both audiences in your decisions will help you keep that precarious balance.
One way to do both is to have ads conform to the essence of your website, meaning the thing that makes your site different i.e. the “killer app” or sine qua non of your website. In this way, you get ads that conform to the reason the users are going to the site. Advertisers have to conform to the ad policy, but, if it really hits on the reason users are going to the site, advertisers should get much greater engagement.
On my own site, 8wordstories.com, ads are allowed, but they’re only allowed an image, eight words of copy, and a two-word call to action. Thus when users go to the site to get pithy stories, eight words in length, the advertisements will similarly be pithy and short.
Consistency
The hero doesn’t train as a medieval knight for the first half of the story and then find herself in space for the second half. That drastic shift can make the audience turn on the story for dashing their expectations. They think you did a bait-and-switch, showing them the medieval story they wanted and then switching to a space story they didn’t want.
If you try to hook users with free pie, but you sell tubas, you will get lots of pie lovers and very few tuba lovers. Worse yet is to have the free pie contingent on buying a tuba. The thing they want comes with a commitment or price tag they don’t. This happens a lot with a free e-book when you have to create an account and fill out a lengthy form. For me, that price has often been too high.
Make sure the way you’re hooking the audience is consistent with what you want them to read, do, or buy. If you sell tubas offer a free tuba lesson or polishing cloth. This’ll ensure they want what you provide and they’ll think of you the next time they need to buy a tuba.
That said, you can still offer free pie; it just shouldn't be what gets them in the door. It should be what pushes them over the edge.
Audible gives you a thirty-day free trial plus an audiobook to keep even if you don't stay past the trial. They're giving you a taste of the product, so when you say, "I want more," you know where to get it.
While not offering a freebie, Dinnerly (and most of the other bazillion meal kit delivery companies) offers a big discount on your first few orders, encouraging new customers to try them out. This can be an especially good model for products or services with high fixed costs around acquiring new customers.
Cognitive Dissonance
There's another danger concerning consistency, but this one's more subtle. If you're reading a medieval story and the author says the "trebuchet launched a rock straight and true, like a spaceship into orbit," it might be an appropriate allusion for a modern audience, but it's anachronistic in a medieval story: a cognitive dissonance. Something doesn't quite make sense or goes against what the audience knows to be true. In the same way, websites that break the flow of their content can alienate their audience without even meaning to (such as with statistics that seem unbelievable or are so specific anyone could achieve them).
112% of people reading this article are physically attractive.
(Here’s lookin’ at you, reader.)
This article is the number one choice by physicians in Ohio who drive Yugos.
(Among other questions, why would a Yugo-driving Ohioan physician read a web user experience article?)
These “statistics” break the flow of the website because they make the user stop and wonder about the website’s reputability. Any time a user is pulled out of the flow of a website, they must decide whether to continue with the website or go watch cat videos.
Recently, I reviewed proposals for a website build at my day job. The developers listed in the proposal gave me pause. One with the title “Lead Senior Developer” had seven years of experience. That seemed low for a “lead, senior” developer, but possible. The next guy was just a “web developer” but had twenty years of experience. Even if that’s all correct, their juxtaposition made them look ridiculous. That cognitive dissonance pulled me out of the flow of the proposal and made me question the firm’s abilities.
Similarly poor quality photos, pixelated graphics, unrelated images, tpyos, mispelllings, weird bolding and anything else that sticks out potato can cause cognitive dissonance and tank a proposal or website (or article). The more often you break the spell of the site, the harder it will be for clients/users to believe you/your product/site/thing are as good as you say. Those cat videos will win every time because they always meet the “lolz” expectation.
Conclusion
Users have many expectations when they come to your site. Placing your users in the context of a story helps you understand those expectations and their motivations. You’ll see what they want and expect, but also what they need. Once you know their needs, you can meet those needs. And, if you’ll pardon my sense of humor, you can both …live happily ever after.
All the signs are that web design is entering a phase of exuberance, with clashing colors, rapidly changing graphics, and dense layouts replacing the minimalism that's dominated digital design for the last decade. Portfolios are beginning to adopt this maximalist approach, but never fear: for those who aren't quite ready for full-on retina burn on a Monday in late October, we've included a few beautifully minimal sites for you to enjoy.
Hello Monday
Hello Monday’s site is utterly charming, with a delightful animation that I could watch for hours. The work section of the site is a masonry-style vertical grid, which is less easy to browse than you would expect, thanks to the number of projects. The best parts of this site are the little details: I love that they tell you how many days it is until Monday, and the way that hamburger menu slips away as you scroll is super-slick.
Bold
Bold’s portfolio is about sending a powerful message. It’s the website equivalent of huge shoulder pads, and an enormous, solid gold smartphone. The way the border expands from the featured images, giving you the sense of zooming into the project is inspired. It helps to have huge-name clients as social proof, but this site is excellent at inspiring confidence in the designers behind it.
Analog is Heavy
Analog is Heavy is a creative photography practice that works with design studios to hone brand messages with high-quality product photography. Its approach to a portfolio is a vertically aligned grid of images, and that’s it. Targeting design agencies means that they’re speaking to an audience of visually educated professionals, giving Analog is Heavy the freedom to let its work sell itself.
Athletics
Another big agency, with a client list to kill for, Athletics jumps right into fullscreen video case studies of its work for clients like IBM. One trend with many of these portfolios is that work is cherry-picked to be showcased and then less-exciting work is linked to below the initial presentation. In Athletics’ case this means an interesting grid of lower-profile, but equally exciting work.
Brittany Chiang
Brittany Chiang builds things for the web. How's that for a no-nonsense approach? This great little site feels very app-oriented thanks to the dark-mode color palette and the monospaced typeface. It's a single-pager, which is increasingly rare these days, and its simplicity works really well. By being true to herself, Brittany has out-UXed plenty of dedicated UX designers.
Shohei Takenaka
As the web drifts towards maximalism, it’s great that there are still calm, simple, minimalist masterpieces to admire. Shohei Takenaka’s site is beautiful, with restraint, attention to detail, and ample whitespace. The subtle underlines on the menu text, and the images protruding into the white space to encourage scrolling, as well as the way the color bands are grouped when you scroll, are all perfect examples of clever UI design.
Aristide Benoist
Aristide Benoist’s portfolio features some beautiful typography. It’s great to see a developer take an interest in the finer points of design. The all-caps sans-serif text is a little too much to cope with in large amounts, but here it works just fine. My favourite part of the site is the transition from thumbnail to case study. Hover over the list of projects and a little flag-like ribbon will appear, click on it and it expands into a full project image, delightful!
WTF Studio
WTF Studio’s portfolio is as in-yer-face as the name suggests. A front for NYC-based creative director Able Parris, the site slaps you in the eyes with color and animation the moment it loads. But scroll down past the anarchic introduction and you’ll find a series of projects for household names presented as individual case studies. It’s exactly what big brands like to see: creativity and safe hands.
Jim Schachterle
Jim Schachterle's site takes an approach that we don't normally see: he's opted for a dark green background. That simple choice, alongside the carefully paired project shots, makes for a sophisticated and distinct style. Unfortunately the choice of typeface doesn't work in places; at 12px the detail in the design is lost altogether, and swapping it out for a simpler sans-serif at the smaller sizes would have been a better choice.
Swwim
Perhaps it's the chilly Northern climate at this time of year, but this Saint-Tropez-looking site for Swwim warms my heart. The rounded sans-serif is an interesting choice — most designers would aim for sharp lines to emphasize precision. I adore the logotype, and its frivolity is echoed throughout the site in the section titles. The less subtle animation feels a little forced, but the wave motion is enticing and brand-appropriate.
Hadrien Mongouachon
Hadrien Mongouachon is a freelance developer, so it makes perfect sense for him to demo his skills front and center on his site. He’s opted for a variation of the highly-trendy liquid effect, and it works really well. I’m not convinced by the sideways type — it only works in print because you can tilt the page — and the usability is a little compromised by the click-hold action. Once you’re accustomed to the site, it’s fun to traverse.
Butchershop
Butchershop is another design agency relying heavily on a video reel to sell its brand work. What’s really interesting about this site, is all the things it does “wrong”: the logo mark is positioned top right instead of top left, the title of its homepage is “Home”. It keeps breaking with received wisdom, so either they know something we don’t, or they didn’t get the memo about UX being a thing — you decide which.
Nikolas Type
It's rare that we get to enjoy a purely type-based portfolio, because design work is visual, but this minimal showcase is for Nikolas Wrobel's type foundry, Nikolas Type. Click through to the product pages and you can edit the preview text. Thanks to the foundry being a small independent, it's able to show some lovely samples that bring the type to life, something that larger foundries often fail to do.
Jam3
It seems video (not static images) is now a must for any portfolio site. Agencies want companies to see real-world experiences and understand what the working relationship is like. Jam3 is no exception, but scroll past the looping video and you'll find a rigorously organized set of projects. The menu isn't easy to locate, but I do like agencies opening up about their approach and culture. Plus there's a cool bubble effect when hovering over the menu items.
New Land
There's a tendency among motion graphics and video firms to be slightly mysterious about who they are and what they do — perhaps it comes from the high concepts of advertising. New Land's target audience probably does know who it is, because this is the kind of company you don't hire without some prior knowledge. Interestingly, the site is geared around tablet- and mobile-preferred interactions, as if intended to be passed around a meeting.
If you are in the field of Marketing either as a marketer, a designer, or a developer, you must have heard of Marketing Automation.
This term has exploded in popularity since 2016 and is one of the hot topics among all types of businesses from startups to enterprise retail & eCommerce companies.
No matter which industry your business is in, marketing automation has become an essential marketing capability to have. Why? Here are some mind-blowing marketing automation stats to support our claims.
What is Marketing Automation?
Marketing Automation refers to the coordination and automation of marketing processes across one or more channels (email, push notifications, SMS, social media) to create engaging interactions with your customers with the goal of maximizing the chances of conversion or retention.
So what are the key benefits of Marketing Automation?
If your company uses marketing automation software, you can gradually expect:
Improved engagement rate: Marketing automation delivers more relevant content to your audience, which leads to more customer interactions and, in turn, a better customer experience.
Improved campaign stats: higher email open rates and click-through rates.
Improved ROI (Return On Investment): The natural result of delivering more relevant content is more conversions.
Improved customer retention: Since your automation flows are behavior-driven, people receive more compelling offers in their inboxes and on their screens. As a result, they return to your site more frequently.
(eCommerce-only) Improved AOV (Average Order Value): Dynamic cross-selling and upselling automations can lift AOV by pitching better-matched offers to your subscribers.
Improved CLV (Customer Lifetime Value): If all of the above occur, you will gradually see your average customer lifetime value rise.
3 Great examples of companies nailing marketing automation
Roughly half of companies worldwide use marketing automation in their mix. Some of the top businesses that have adopted marketing automation as a core capability are:
Amazon: Amazon's dynamic product recommendation engine has become a classic best practice in the eCommerce industry in recent years. Its marketing automation strategy delivers a deeply personalized experience and maximizes profits.
Hubspot: Famous for its inbound marketing strategy in general, and specifically for its email automation sequences that nurture leads and convert them into customers.
Buzzfeed: Renowned for its hyper-segmented email sequences, with educational email workflows that have made it famous among marketing professionals worldwide.
The 10 marketing automation stats that will totally blow your mind
Here are the 10 most impressive marketing automation stats to know:
Marketing automation sales worldwide reached $13 billion (Statista, 2019)
Marketing automation sales are expected to exceed $25 billion by 2023 (Statista, 2019)
On average, 49% of companies are currently using marketing automation, with more than half of B2B companies (55%) adopting the technology. (Adobe, 2018)
Marketing automation drives a 14.5% increase in sales productivity and a 12.2% reduction in marketing overhead. (Nucleus Research, 2017)
77% of companies using marketing automation saw an increase in their conversions (VB Insight)
82% of people said that they would buy more products online via email if the emails they received were personalized around their preferences. (Maildocker, 2016)
Price drop reminder open rates average at 38.33% (ContactPigeon, 2018)
Triggered emails have a click-through rate 152% higher than traditional emails. (Email Monks, 2018)
Cart abandonment emails have an average open rate of 30% (Marketing Hub, 2018)
The average click-through rate of a transactional email is ~300% better than the regular promotional email (Smart Insights, 2018)
Why are these statistics important?
The marketing automation market has been growing year after year, and over 50% of companies worldwide have adopted the technology with proven results. At this point, having solid marketing automation functionality within your marketing stack is no longer a "nice-to-have" feature but a competitive necessity to stay ahead of the market.
4 Top marketing automation stats for Black Friday and eCommerce
As all marketers within retail and eCommerce are well aware, Black Friday weekend brings the biggest shopping days of the year, both online and offline. At ContactPigeon, we run an annual assessment of Black Friday marketing automation metrics across all of our retailers. So, here are the top marketing automation stats we gathered:
7% of all Black Friday eCommerce sales came from marketing automation
…and let’s break that down to:
4% of all Black Friday eCommerce sales came from behavioral-triggered pop-ups
2% of all Black Friday eCommerce sales came from automated push notifications
1% of all Black Friday eCommerce sales came from email automation
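As a quick sanity check, the per-channel figures above should add up to the overall automation share. A minimal sketch (using the illustrative percentages from the breakdown above):

```python
# Share of Black Friday eCommerce sales attributed to each automated
# channel, per the ContactPigeon assessment figures quoted above.
channel_share = {
    "behavioral-triggered pop-ups": 4.0,
    "automated push notifications": 2.0,
    "email automation": 1.0,
}

# The overall marketing-automation share is the sum of the channel shares.
total_automation_share = sum(channel_share.values())
print(f"{total_automation_share:.0f}% of Black Friday eCommerce sales "
      f"came from marketing automation")
```

Summing 4% + 2% + 1% recovers the 7% headline figure, which is why the breakdown is presented as a decomposition of that total.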
Why are these statistics important?
These stats show that the natural evolution of marketing automation is omnichannel marketing automation. The brands that get best at building cross-device and cross-channel marketing strategies will be able to win a bigger market share.
Not a trend. A necessity.
Marketing automation used to be a hot trend a few years ago. That era has come to an end, since half of the companies globally have already adopted it or are in the process of adopting it.
Marketing automation has evolved from a trend into a necessity and the above-mentioned marketing automation ROI stats demonstrate two points.
First, if you are among the early adopters, you should probably focus on building an omnichannel marketing automation strategy. Second, if you are still debating whether to adopt marketing automation, your business may already be late to the game. The key, therefore, is to accelerate the process by seeking a well-established marketing automation solution that can be customized to fit your industry niche and internal processes.