At some point, most businesses undertake a website redesign. A bad design drives visitors away: nearly half of the people who abandon a website cite its design as a leading factor, and design shapes opinions of a business’s credibility. There are many reasons to redesign a website, such as fixing SEO errors. Others include:
- Revamping the look
- Enhancing legitimacy
- Introducing essential new functionality for a better user experience
- Refreshing branding
- Improving SEO value
Redesigns can also have severe and often unexpected side effects that wreck search engine optimization (SEO). Rankings tumble when the redesign process does not incorporate SEO.
Not all redesigns increase traffic or elevate your rankings in Google search results. Retaining your SEO value and organic traffic must be a major consideration in the website redesign process and should be part of the plan, the implementation, and the post-launch assessment.
The Meaning of Website Redesign For Fixing SEO Errors
A website redesign changes a site’s functions and features, fixing SEO errors while improving its appearance and usability. Common objectives include:
- Boosting conversion rates and sales
- Reducing bounce rates
- Improving the user experience
- Highlighting new branding
- Enhancing SEO value
- Revitalizing unattractive, old-fashioned web design
- Increasing page loading speed
- Simplifying site navigation
- Gaining an edge over competitors
- Attracting new traffic and reaching new audiences
Impact of Website Redesign on Search Engine Optimization
It is not unusual for websites to lose traffic and see lower rankings after a redesign. Yet not all redesigns are bad for SEO: carried out correctly, a redesign need not hurt SEO value at all.
Do not panic if there is a slight dip in traffic immediately after a website redesign goes live. The reason may be that search engines have to crawl the revamped website before they serve the new version to users.
This often means a temporary decrease in search traffic; once the search engines work out the new site’s structure, business as usual usually returns quickly. Some websites, however, see their traffic take a sharp, lasting dive, or suffer depressed traffic and low rankings that persist for weeks after a redesign.
That turn of events is cause for concern and warrants a close inspection to determine the problem. One of the most common reasons for the web traffic decrease is serious SEO mistakes committed during the redesign, like unwittingly failing to set up a proper redirect structure or SEO being an afterthought.
Losing organic traffic and SEO position is costly and time-consuming to recover from, and may even sink a business if it is not swiftly remediated. Regaining a valuable ranking position can be challenging, if it is possible at all.
Some businesses need the help of an SEO consultant, like ADA Site Compliance, to reverse damage that is not always quickly or easily repaired. Every visitor who no longer finds the site through search results is a lost potential conversion.
SEO reportedly drives 300 percent more traffic to a site than social media, and roughly 70 percent of clicks on search engine result pages go to organic results. You cannot underestimate the impact of SEO on a business’s bottom line: nearly half of Google users discover new products through search. For that reason, the website redesign process must keep SEO considerations at the forefront.
SEO Errors & Mistakes To Avoid on Website Redesign
Familiarity with these common website redesign errors helps minimize the risk of lost organic traffic and a site-wide drop in rankings.
1. Winning Content
It may be tempting to start with new brand messaging and tone, but it is wise not to try to fix what is not broken. Do not scrap or revise high-ranking content that features priority keywords.
High-performing content is responsible for featured snippets, inbound links, and traffic you do not want to lose, and it could take quite some time to rebuild. Search engines want to offer users the best experience possible.
In part, that involves matching searches to relevant results. When the content on a page changes, the search engines reassess its relevance for the targeted keyword. If a page now carries drastically different information for that keyword, or no longer exists, the search engines may decide it no longer provides a valuable user experience for the keyword and demote its ranking.
A demotion is especially problematic for highly competitive, business-critical keywords. A page that previously ranked high for a specific keyword may now redirect users to your home page, which makes no reference to the searched keyword and fails to satisfy the user’s intent.
A helpful hint is to check the website analytics to identify the most popular pages with the lowest bounce rates and highest conversions, then avoid substantially altering or removing their content so they retain their rankings after the redesign. That is not to say you should not revisit those pages for stale or outdated content, but make minimal changes: retain the core keywords and the H1 and H2 tags.
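As a rough illustration, a short script can compare the headings on a live page with its redesigned counterpart to catch accidental loss of core keywords. This is a minimal sketch, assuming Python with the requests and beautifulsoup4 libraries; the URLs are hypothetical placeholders.

```python
import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def headings(url: str) -> list[str]:
    # Collect the text of every H1 and H2 on the page.
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    return [h.get_text(strip=True) for h in soup.find_all(["h1", "h2"])]

# Hypothetical live and staging URLs for the same page.
live = headings("https://example.com/top-page")
staged = headings("https://staging.example.com/top-page")
for heading in live:
    if heading not in staged:
        print(f"Heading dropped in redesign: {heading!r}")
```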
2. Page Speed
Page loading speed is a significant ranking signal, and slow pages cause visitors to abandon a website: as many as 93 percent of visitors reportedly leave a site that does not load quickly. Research has also shown that improving page speed by one second can increase conversions by seven percent and attract more visitors.
A redesigned website may be slowed down by retaining large amounts of unoptimized, unnecessary code that needs to be deleted or adding uncompressed, large images that do not load quickly.
Google introduced Core Web Vitals, user-centric metrics used to measure and quantify the user experience. Fast loading speeds are more important than ever. You cannot ignore this area of SEO when redesigning a website.
A page-load checker is the best means of testing a redesigned website’s speed. The tool helps pinpoint what slows down the revamped site so you can solve the problem before the website goes live.
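For instance, Google’s public PageSpeed Insights API can be scripted into a pre-launch check. The sketch below is a minimal example in Python using the requests library; the staging URL is a hypothetical placeholder, and heavier use of the API may require an API key.

```python
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def check_page_speed(url: str, strategy: str = "mobile") -> None:
    # Ask PageSpeed Insights for a Lighthouse report on the page.
    resp = requests.get(PSI_ENDPOINT, params={"url": url, "strategy": strategy}, timeout=60)
    resp.raise_for_status()
    lighthouse = resp.json()["lighthouseResult"]
    score = lighthouse["categories"]["performance"]["score"]
    print(f"Performance score ({strategy}): {score * 100:.0f}/100")
    # Surface the Core Web Vitals-related audits mentioned above.
    for audit_id in ("largest-contentful-paint", "cumulative-layout-shift", "total-blocking-time"):
        audit = lighthouse["audits"][audit_id]
        print(f"{audit['title']}: {audit.get('displayValue', 'n/a')}")

check_page_speed("https://staging.example.com/")  # hypothetical staging URL
```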
3. Content Above the Fold
It is best to consider what will sit above the fold when planning a website redesign. The fold is the area at the top of a web page visible without scrolling, and each page should have some unique content above it.
Whatever visitors see first is also what search engines take into account. Google explicitly advises site owners to have unique content placed above the fold. It may be tempting to replicate new messaging above the fold, across the top of all pages.
That technique will hurt search rankings. It is best to focus on engaging, unique content that hints at what is to come and entices users to keep scrolling. Keep non-desktop visitors in mind as well: above-the-fold content is twice as important on mobile devices.
JavaScript
JavaScript makes websites more engaging and dynamic. When implemented incorrectly, it can cause serious SEO mistakes that get a website blacklisted by Google. JavaScript can be problematic for search engines to crawl and index.
If JavaScript is not ‘seen’ by search engines, they cannot rank it. Website owners have tried to skirt the issue of search engine bots having access to JavaScript content with a technique called ‘cloaking.’ Cloaking shows different content to users than that presented to search engines.
Cloaking violates Google’s Webmaster Guidelines and may lead to ranking penalties for a redesigned website. If a website suddenly plummets in the rankings after a redesign, a large amount of newly launched JavaScript may be the cause.
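One low-tech way to spot the risk is to fetch the raw, unrendered HTML, roughly what a basic crawler sees, and check whether key content is present without JavaScript execution. Below is a minimal sketch in Python; the URL and phrases are hypothetical placeholders.

```python
import requests

# Hypothetical phrases that should be visible even without JavaScript rendering.
CRITICAL_PHRASES = ["Our Services", "Contact Us"]

html = requests.get("https://staging.example.com/", timeout=30).text
for phrase in CRITICAL_PHRASES:
    status = "present" if phrase in html else "MISSING without JS rendering"
    print(f"{phrase!r}: {status}")
```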
4. Redirecting URLs
Visitors get a bad user experience when they hit an unexpected 404 error, and for pages with backlinks this is an especially concerning SEO mistake. Search engines need to know when a page has moved, where it went, and how to find it.
If they do not, the page is considered lost and visitors receive a 404 notice. A 301 redirect passes the page’s former ranking power to the new location instead of starting from scratch.
Do not redirect all retired pages to your home page. Search engines want to offer users the most relevant result for a query; if a site sends them to a home page that does not address the request, traffic drops accordingly because the result does not serve the user.
Direct traffic to a relevant page instead of forcing visitors to find it themselves. Redirecting URLs is exceptionally time-consuming, especially for large sites, so use an automated crawler to map valuable backlinks and URLs and set up their redirects before you launch a redesigned website.
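Once the map exists, verifying it before launch can be scripted. The sketch below, with hypothetical URLs, checks that each legacy URL returns a single 301 hop to its intended destination; it assumes Python with the requests library.

```python
import requests

# Hypothetical redirect map: every legacy URL should 301 to its most
# relevant new counterpart, not to the home page.
REDIRECT_MAP = {
    "https://example.com/old-services": "https://example.com/services",
    "https://example.com/2019-pricing": "https://example.com/pricing",
}

for old_url, expected in REDIRECT_MAP.items():
    # Do not follow redirects so the first hop can be inspected directly.
    resp = requests.get(old_url, allow_redirects=False, timeout=30)
    if resp.status_code != 301:
        print(f"{old_url}: expected 301, got {resp.status_code}")
    elif resp.headers.get("Location") != expected:
        print(f"{old_url}: redirects to {resp.headers.get('Location')}, not {expected}")
    else:
        print(f"{old_url}: OK")
```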
It is wise to scan for broken links after the redesign goes live to fix links that may have gone astray or are subject to crawl errors that prevent pages from surfacing in Search Engine Result Pages. Broken links may be responsible for losing a visitor who will not return or convert.
5. Mobile Devices
Studies show that the first position on mobile devices nets almost one-third of all clicks, compared with 19 percent for the same position on desktops. Mobile users are also more likely to stop engaging with a website that is not mobile-friendly.
Mobile friendliness is a critical factor for Google rankings. A website redesign has to work as well for mobile devices as for desktops. A seamless mobile search experience is non-negotiable. Many businesses continue to make the desktop experience a priority. It has a detrimental impact on SEO.
Fast loading speeds and responsive design are essential to mobile-friendliness. If it takes more than three seconds to load, 40 percent of mobile users ditch the site. Mobile design best practices include not requiring visitors to zoom or pinch content to read it.
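A quick smoke test for responsiveness is to confirm each template declares a viewport meta tag, which responsive pages normally carry. This is a minimal sketch assuming Python with requests and beautifulsoup4; the staging URL is hypothetical.

```python
import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

html = requests.get("https://staging.example.com/", timeout=30).text
soup = BeautifulSoup(html, "html.parser")
viewport = soup.find("meta", attrs={"name": "viewport"})
if viewport:
    print("viewport tag:", viewport.get("content", ""))
else:
    print("viewport tag MISSING - page may not be mobile-friendly")
```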
6. XML Sitemaps
Up-to-date XML sitemaps are valuable tools for SEO. They help search engine crawlers find and index a website’s content faster, telling them how the site is structured and which pages to prioritize.
A sitemap should not be static. It should be manually updated to reflect the redesigned website. If it is not, Google and other search engines may not be able to comprehensively and quickly crawl the site.
Search engines do not check sitemaps on every crawl. Sitemaps are read the first time a search engine encounters them or when it is alerted that a sitemap has changed. Immediately after completing a redesign, submit an up-to-date sitemap through Google Search Console and to the other search engines where the website appears.
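Regenerating the sitemap can also be scripted so it always matches the final URL list of the redesigned site. Here is a minimal sketch using only Python’s standard library; the page URLs are hypothetical placeholders, and the resulting sitemap.xml would then be submitted through Search Console.

```python
from datetime import date
import xml.etree.ElementTree as ET

# Hypothetical final URL list for the redesigned site.
PAGES = [
    "https://example.com/",
    "https://example.com/services",
    "https://example.com/contact",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in PAGES:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page
    ET.SubElement(url, "lastmod").text = date.today().isoformat()

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```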
Benchmark SEO Performance
Before the redesign, it is critical to benchmark site performance so you can later establish whether any drop in rankings and traffic was caused by the changes or by an unrelated issue such as a search engine algorithm update or a hack.
A comprehensive SEO audit is the best way to benchmark site performance. ADA Site Compliance can help evaluate the site’s performance across SEO indicators. Those of most importance are:
- Bounce rates
- Click-through rates
- Conversion rates
- Keyword rankings
- Page speed
- Top landing pages
The audit goes hand in hand with integrating SEO into the website redesign. SEO should be part of the process from the start, and taking stock of all content assets and their search performance should be done before anything else.
It is much simpler to bake SEO into a new website than to remediate issues afterward. By the time you discover that a live revamped website has hundreds of broken links and fix them, your rankings will likely have suffered. You can avoid the damage with an SEO audit of the website before it goes live.
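Part of such an audit, the broken-link check, is easy to automate. The following is a single-page sketch, assuming Python with requests and beautifulsoup4 and a hypothetical staging URL; a real audit would crawl the entire site.

```python
import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4
from urllib.parse import urljoin

BASE = "https://staging.example.com/"  # hypothetical staging site
soup = BeautifulSoup(requests.get(BASE, timeout=30).text, "html.parser")

for anchor in soup.find_all("a", href=True):
    link = urljoin(BASE, anchor["href"])
    if not link.startswith(BASE):
        continue  # skip external links
    status = requests.head(link, allow_redirects=True, timeout=30).status_code
    if status >= 400:
        print(f"Broken link: {link} ({status})")
```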
Preservation Rather than Improvement
A redesign is not only the time to preserve search success but also the optimal time to understand and tackle existing SEO weaknesses. It only makes sense to reach as many users as possible through a solid organic search strategy. Areas that may require optimization include:
- Broken links
- Duplicate content
- Images that are too big
- Missing alt tags, descriptions, and meta descriptions
- Page titles that are too lengthy
- Pages with no backlinks
It is wise to address these problems while the development work on the refreshed website takes place. It is pointless to push an aesthetically pleasing website live if it is riddled with SEO mistakes that create a poor user experience.
After launching a website, monitoring it regularly for gains or dips in page rankings, conversion rates, and search traffic is vital. The sooner a problem or underperforming area is spotted, the faster you can fix it.
A good content management system helps keep SEO part of the process from beginning to end, avoiding common SEO mistakes and retaining, even boosting, a website’s rankings and organic traffic.
7. Google Reviews
Any service or product can be expected to receive both positive and negative reviews. A growing number of reviews increases the business’s social proof and contributes to local search ranking factors.
Positive reviews improve a site’s credibility, but a website with only positive reviews looks suspicious. On the other hand, multiple customers who express dissatisfaction may drive away traffic.
You can use negative reviews to build customer relationships, understand your audience, and gain new visitors to your site. Post the issues you face and the measures you take to handle them on social media.
It will add to the website’s credibility. With the help of social media and industry-specific review sites, you can encourage visitors to post reviews about your services or products and openly acknowledge questions.
8. Google My Business PIN
A common problem is trouble with the verification code (PIN) you receive for a Google My Business account. One solution may be changing a ‘private’ profile to ‘public’; if that does not work, request a new code. The problem often follows an update of details such as the business name, phone number, or address, so make sure the contact information is consistent with the listing.
9. Location-Specific Pages
A Google report states that searches with ‘near me’ and location-specific keywords have increased, especially among mobile users; mobile searches including ‘near me’ have grown by 900 percent in four years.
Of those users, 76 percent who search for something nearby on a mobile device visit the business within a day, and approximately 28 percent of those visits result in a purchase. Given the growth of mobile-based consumers, local visibility is critical for many companies.
A crucial component of local and global SEO is high-quality, location-specific optimized content. A strong local presence makes ranking in local search results easier. Fix this SEO mistake by building well-optimized, unique location-specific pages that help the site rank in local search engine result pages.
10. Contact Forms
In one study of 1.5 million users, only 49 percent of those who saw a contact form started filling it out, and only 16 percent of those completed it. Reasons why contact forms do not turn into conversions include:
- Too many fields in the form
- Too many or too few dropdown options
- Non-functioning or unclear submit buttons
To optimize a contact form, make it simple and engaging; fewer form fields help boost conversion (a short form-audit sketch follows the list below).
- Do not use CAPTCHA
- Enable auto-fill with the Google autocomplete plugin
- Ensure the form is mobile optimized
- Focus on the alignment
- Keep the form simple with five or fewer required fields
- Perform A/B tests on the form for required fields, size, position, color, etc.
- Use a unique, catchy call-to-action
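As a quick way to apply the field-count guideline above, a script can tally the total and required fields in every form on a page. This is a minimal sketch assuming Python with requests and beautifulsoup4; the contact-page URL is a hypothetical placeholder.

```python
import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

html = requests.get("https://example.com/contact", timeout=30).text
soup = BeautifulSoup(html, "html.parser")

for form in soup.find_all("form"):
    fields = form.find_all(["input", "select", "textarea"])
    required = form.find_all(attrs={"required": True})
    note = "" if len(required) <= 5 else " - consider trimming required fields"
    print(f"Form: {len(fields)} fields, {len(required)} required{note}")
```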
11. Text to HTML Ratio
This is one of the SEO mistakes that gets less attention because it relates to technical SEO. The visible content of a web page is text, while the underlying HTML code includes images, JavaScript, links, headings, and more.
The text-to-HTML ratio compares the amount of visible text to the amount of HTML code that search engines find on a page. The suggested ratio is 25 to 70 percent; a low ratio can indicate deeply rooted on-page technical SEO errors.
Google uses the metric to check the relevancy of a web page (a short measurement sketch follows the list below). Low ratios are indications of:
- Excessive use of JavaScript, inline styling, and Flash
- Hidden texts – a warning for search bots
- Slow-loading websites due to excessive code
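Measuring the ratio takes only a few lines. The sketch below strips script and style blocks and compares visible text length to total HTML length; it assumes Python with requests and beautifulsoup4, and the URL is a hypothetical placeholder.

```python
import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

def text_to_html_ratio(url: str) -> float:
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style"]):  # keep only visible text
        tag.decompose()
    text = soup.get_text(separator=" ", strip=True)
    return len(text) / len(html) * 100

print(f"Text-to-HTML ratio: {text_to_html_ratio('https://example.com/'):.1f}%")
```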
A low ratio is an indicator that page load time needs improvement, since HTML-heavy pages load slowly and hurt the user experience. Add relevant on-page text, move inline scripts into separate files, and remove unnecessary code so the page’s size stays as small as possible. Another frequent SEO error is not checking whether Google indexes your pages.
12. Indexed Pages
Indexing means including web pages in Google’s searchable index; Google cannot rank what its search bots have not crawled. Some reasons pages are not indexed are:
- 404 errors
- Duplicate content
- Pages with improperly coded meta tags
- Sitemap not up-to-date
The best route to solving Google indexing problems is Google Search Console. Use its URL Inspection tool to learn whether a page is indexed. A comprehensive website crawler such as Deepcrawl lets you simulate how search bots crawl a website.
You can dig deep without affecting the website’s performance and receive a comprehensive URL report. Check that the robots.txt file is not blocking the pages, and after submitting a URL, give Google time to recrawl the page.
Optimize the XML sitemap so it is not cluttered with broken or redirected pages. Screaming Frog is a tool that can crawl websites with the user agent set to ‘YandexBot,’ ‘Baiduspider,’ ‘DuckDuckBot,’ ‘Slurp,’ ‘Bingbot,’ or ‘Googlebot.’
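A related sanity check, using only Python’s standard library, is to confirm the new robots.txt does not accidentally block Googlebot from key pages; the URLs below are hypothetical.

```python
from urllib.robotparser import RobotFileParser

robots = RobotFileParser("https://example.com/robots.txt")
robots.read()

for page in ("https://example.com/", "https://example.com/services"):
    allowed = robots.can_fetch("Googlebot", page)
    print(f"{page}: {'crawlable' if allowed else 'BLOCKED by robots.txt'}")
```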
13. Meta Descriptions and Title Tags
The title tag is an HTML element that specifies the title of a page and is displayed on search engine result pages. The optimal meta title format is Primary Keyword – Secondary Keyword – Brand Name.
An optimal meta title is 50 to 60 characters, depending on character width; Google truncates titles at approximately 600 pixels. Besides appearing on search engine result pages, meta titles show up on social media and in the web browser tab.
A meta description has an optimal length of 50 to 160 characters. Meta descriptions do not directly affect rankings, but they influence click-through rates and thus have an indirect impact on SEO. Meta description problems include:
- Duplicated meta description tags
- Lengthy meta descriptions that get truncated, cutting off targeted keywords
- Use of non-alphanumeric characters in meta descriptions
Add a clear call-to-action to the meta description, and keep title tags and meta descriptions within the optimal lengths so targeted keywords are not lost to Google truncation.
Do not over-optimize title tags with SEO keywords. Use vertical bars to separate keywords in a title tag, and be sure the title is easily understood by readers rather than stuffed with keywords that do not make sense.
If you do not add meta description tags, social media sites may pull the first text found on a page, which rarely creates a good user experience. Make sure title tags and meta descriptions are unique on every page.
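A pre-launch sweep for these problems can be scripted against the length guidelines above. This is a minimal sketch assuming Python with requests and beautifulsoup4; the page list is hypothetical.

```python
import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

PAGES = ["https://example.com/", "https://example.com/services"]  # hypothetical
seen_titles: set[str] = set()

for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    tag = soup.find("meta", attrs={"name": "description"})
    desc = tag.get("content", "").strip() if tag else ""
    if not 50 <= len(title) <= 60:
        print(f"{url}: title is {len(title)} chars (aim for 50-60)")
    if not 50 <= len(desc) <= 160:
        print(f"{url}: meta description is {len(desc)} chars (aim for 50-160)")
    if title in seen_titles:
        print(f"{url}: duplicate title {title!r}")
    seen_titles.add(title)
```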
14. URLs
A URL, or uniform resource locator, is the web address that specifies a web page’s location on the internet. Keep URLs short, and in any case under 2,083 characters, so they avoid truncation and render correctly in all browsers.
Readable URLs provide more search visibility and a better user experience, which benefits SEO. Major factors that make a URL less readable and cause an audience to lose interest are:
- Too many stop words or connectors
- Special characters, such as #, %, and <space>
- Use of HTTP rather than HTTPS
- Including more than two folders in a URL
A well-crafted URL benefits search engines and readers. The significance of URL readability is illustrated by the following:
- Readable URLs make links more clickable.
- Shorter URLs rank better than longer ones, which helps drive direct traffic.
- HTTPS is a secure HTTP version. The information on the website is encrypted, which significantly heightens security.
- Relevant keywords must be in the URL.
- Hyphens, not spaces, should separate the keywords in a URL (a small slug helper is sketched below).
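To make the hyphenation rule concrete, here is a toy slug helper in Python; the function name and word limit are illustrative choices, not a standard.

```python
import re

def slugify(title: str, max_words: int = 5) -> str:
    # Lowercase, strip punctuation, and join the leading words with hyphens.
    words = re.sub(r"[^a-z0-9\s-]", "", title.lower()).split()
    return "-".join(words[:max_words])

print(slugify("SEO Errors & Mistakes to Avoid!"))  # -> "seo-errors-mistakes-to-avoid"
```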
15. Faulty Redirects
Misplaced redirects may cause a loss of traffic and SEO value, so consider cleaning up any redirect that delivers little or no value. When many 301 redirects build up over time, they stop producing the SEO value they once did.
Accumulated redirects increase load time and bounce rate. Solutions include the following (a short chain-detection sketch appears after the list):
- Redirecting HTTP versions of pages to HTTPS
- Run the site through Screaming Frog; under the Bulk Export menu, use the All Inlinks report to find internal links that point at 301 redirects so you can fix them
- Fix broken redirects in the site structure
- Redirect 404 pages
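Redirect chains are easy to detect programmatically because HTTP clients record each hop. Here is a minimal sketch assuming Python with the requests library; the URLs are hypothetical.

```python
import requests

for url in ("http://example.com/old-page", "http://example.com/blog"):
    resp = requests.get(url, timeout=30)
    hops = [r.url for r in resp.history] + [resp.url]
    if len(resp.history) > 1:
        # More than one hop: a chain that adds latency and dilutes SEO value.
        print(f"Chain ({len(resp.history)} hops): " + " -> ".join(hops))
    else:
        print(f"OK: {url} -> {resp.url}")
```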
Google Search Console
Google Search Console is a helpful tool for understanding how a website performs and what can be done to improve its appearance in Google searches to bring in more relevant traffic.
It provides information on how Google crawls, indexes, and serves sites, and its reports help website owners monitor and optimize search performance. You do not have to log in daily.
When Google finds issues on a site, the website owner receives an alert email. It is still wise to check the account monthly, and whenever the website’s content changes, to ensure the data is stable.
Keyword Research
The constant evolution of how search engines rank web pages keeps you on your toes. Keyword research has shifted toward a deeper understanding of what words mean in different contexts, particularly as part of overall topics.
Strategically choosing topics and phrases is still essential, but they are used differently than in the past. Search engines once relied heavily on backlinks and plain text data to determine monthly refreshed rankings. Google is now a sophisticated product with many algorithms designed to promote results and content that offers searchers a helpful user experience.
Search Engines
Most people rely on one or two search engines, which they use to find:
- Helpful options to tighten or broaden a search
- Relevant results
- An easy-to-read, uncluttered interface
Google is the most used search engine: it is relevant, fast, and has the most extensive catalog of available web pages. Others include:
- DuckDuckGo Search
- Bing Search
- Dogpile Search
- Google Scholar Search
- Webopedia Search
- Yahoo Search
- The Internet Archive Search
When improving SEO, focus on increasing user engagement and producing high-quality content. Some SEO errors are simply human, and a regular site audit ensures a website stays in good health. ADA Site Compliance would be happy to help.
When the website redesign launches, check the Google cache to ensure the website is crawled correctly, and never assume it is crawlable until it appears in the Google index. Google effectively allows a limited window to correct SEO problems before the site falls out of the rankings; it will not keep sending searchers to a site that does not work.
Be prepared to roll back to the last published version to restore traffic and rankings if the redesign is causing a problem, and correct any issues before the window closes again. When the new website appears in the Google cache, switch between the ‘Full version’ and the ‘Text-only version’ to ensure everything visible in the full version also appears in the text-only version.
Contact ADA Site Compliance today for all your ADA website compliance and web accessibility needs!