How I scored 100 on HubSpot's Website Grader



It was never really my intention to get a perfect score on HubSpot's Website Grader.  Sure, I have an obsession with performance and secretly wanted to get it there, but I didn't exactly sit down with the intention of doing it.  In fact, scoring 100 was the result of testing, investigating, and implementing suggested solutions from every possible SEO tool I could find out there.  Many of these SEO report tools focus on different aspects of a site in determining the "grade," including performance, security, social media, structured data, and more.  I'll save my findings and thoughts on the other SEO tools for another post.  For now, I will focus on how I achieved this score.


Website Grader gives you all the information needed to identify the areas for improvement, but I found that its "learn more" links didn't always point me to a proper solution.  What wound up happening is that I would spend hours at a time researching these items and the numerous proposed solutions floating around on the web.





Pretty straightforward here: mind your code.  At one point I thought it would be a good idea to inline all of my SVGs (scalable vector graphics) to reduce requests, but in fact I bloated my HTML with extra code that didn't need to be there.  We'll tie it in with compression in a minute, but the bottom line is to cut out the bloat and load anything below the fold separately.
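To show what I mean by cutting the bloat, here's a minimal sketch of referencing an SVG as an external image instead of pasting the full markup inline; the file name is just a placeholder.  The browser can cache the file, and the HTML stays small.

<!-- instead of inlining the entire <svg>...</svg> markup -->
<img src="/images/logo.svg" alt="Site logo" width="200" height="80" />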

Beyond page size, there's some SEO weight behind the code-to-text ratio to keep in mind as well.  If your code significantly outweighs your content, it can have a negative impact.




Here, it's all about keeping things together.  Combine your files into as few as possible to avoid unnecessary trips back to the server.  Google likes to see a page render the 'above the fold' content as fast as possible, with 'above the fold' being the content that is visible when the page first loads (like your nav bar, banner, logo, etc.).  If you have a lot of scripts and style sheets in the <head>, those are all downloaded in full before the page renders.  You can work around third-party scripts with the async attribute:

<script async src=""></script>
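For scripts you host yourself, defer is worth a look too: it also downloads without blocking, but it waits to execute until the document has been parsed, so order is preserved.  The URLs below are placeholders, just to show the pattern.

<!-- third-party script: async downloads in parallel and runs as soon as it's ready -->
<script async src="https://example.com/analytics.js"></script>
<!-- your own script: defer downloads in parallel but runs after the HTML is parsed -->
<script defer src="/js/site.js"></script>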




Your page speed will reflect your efforts on the two previous items.  If your page size is small and your requests are low, the page will load quickly.




To enable browser caching for a website hosted in IIS, you'll need to modify the Web.config file.  Note that I have set cacheControlMaxAge="7.00:00:00"; the value uses the days.hours:minutes:seconds format and specifies how long the browser is allowed to cache static content.  Since I am still modifying things throughout the weeks, I only allow static content to be cached for 7 days right now.

<!-- add to the <staticContent> element inside <system.webServer> -->
<staticContent>
    <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
</staticContent>
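With UseMaxAge, IIS converts that TimeSpan into a max-age value in seconds, so static responses should now come back with a header along these lines (7 days = 604,800 seconds):

Cache-Control: max-age=604800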




Likely not an issue.  If you're redirecting from an old domain to a new one, you should be grading the new domain anyway.





Compression was a fun one.  This will assist in page speed and overall performance because the content is compressed before it's delivered to the browser.  We already had compression turned on.  However, I use SVG graphics almost exclusively on this site, and one thing I didn't know is that IIS doesn't compress SVG graphics by default.  So, we had to modify the applicationHost.config to include the image/svg+xml mimeType.

<!-- add to the <httpCompression> section of applicationHost.config -->
<httpCompression directory="%SystemDrive%\inetpub\temp\IIS Temporary Compressed Files">
    <scheme name="gzip" dll="%Windir%\system32\inetsrv\gzip.dll" />
    <dynamicTypes>
        <!-- other mimeTypes -->
        <add mimeType="image/svg+xml" enabled="true" />
    </dynamicTypes>
    <staticTypes>
        <!-- other mimeTypes -->
        <add mimeType="image/svg+xml" enabled="true" />
    </staticTypes>
</httpCompression>
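To confirm it's working, request an .svg from any client that sends Accept-Encoding: gzip and check that the response now includes:

Content-Encoding: gzip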





There are a lot of opinions and suggested best practices on how to do this.  The way I accomplished it was to inline some critical above-the-fold CSS in the <head> and then place the <link> to the CSS at the bottom of the page.  When you first visit the site, the text loads as a default font for less than a second before the CSS is downloaded and overrides the default.
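A rough sketch of that structure is below; the selectors and file name are placeholders rather than my actual stylesheet.

<!-- in the <head>: just enough CSS to paint the above-the-fold content -->
<style>
    header { background: #ffffff; }
    .banner { width: 100%; }
</style>

<!-- at the end of <body>: the full stylesheet, so it doesn't block the initial render -->
<link rel="stylesheet" href="/css/site.css" />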




Use percentages for padding, margin, width, and height where possible.  Not only does this allow the content to scale fluidly with the screen size, but also enables you to keep your CSS smaller with fewer class adjustments under @media rules.
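To illustrate (the class name and breakpoint here are placeholders), a fluid width needs one small adjustment instead of a stack of fixed-size overrides:

/* fluid by default: scales with the viewport, no breakpoint needed */
.content { width: 90%; margin: 0 auto; padding: 2%; }

/* a single tweak for larger screens */
@media (min-width: 960px) {
    .content { width: 70%; }
}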




Because high-DPI screens are becoming more common, it's now more important than ever to declare the viewport in the <head> of your document.  It won't compensate for display scaling set by the OS, but it will compensate for high-DPI displays wanting to render as a standard display.  If you want users to be able to pinch-zoom on mobile devices, ditch maximum-scale=1.

<meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1, maximum-scale=1" />




First, make sure every page has a title.  Then, make sure your titles are optimized for search engine results. Google typically displays the first 50-60 characters of your title.  If the title is longer than that, you wind up with the title cut off by ellipses... 
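Something along these lines (the wording is purely illustrative) keeps the important words up front and stays under the cutoff:

<title>Website Performance Tips for IIS | Example Company</title>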





Similar to page titles, the meta description is the block of text that shows under your title in search engine results.  For it to be relevant and captivating, you want to write it yourself and keep it under 160 characters.  If you don't have a meta description in your <head>, Google will just pull the first 160 characters of text it encounters.
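It sits in the <head> alongside the title; the copy below is just a placeholder to show the format and rough length.

<meta name="description" content="A walkthrough of the performance, caching, and SEO tweaks I made to score 100 on HubSpot's Website Grader." />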




H tags provide a hierarchy for search engines as well as readers.  However, there is a lot of conflicting info out there about the best way to use them.  Some places say no more than 5 <h1> tags, others say only 1 <h1> tag.  I think it's debatable, but I ultimately went with one <h1> and then multiple <h2>, <h3>, and <h4> tags.
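In outline form, that structure looks something like this (the headings are placeholders):

<h1>How I scored 100 on HubSpot's Website Grader</h1>
    <h2>Performance</h2>
        <h3>Page Size</h3>
        <h3>Compression</h3>
    <h2>SEO</h2>
        <h3>Page Titles</h3>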




Another straightforward item.  Mainly, use them to submit your site to Google and Bing via Search Console and Webmaster Tools.
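For reference, a bare-bones sitemap.xml looks something like this (example.com is a placeholder domain):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
        <loc>https://www.example.com/</loc>
    </url>
</urlset>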




In the interest of security, everyone should be using an SSL certificate these days.  I read something the other day about a website not using HTTPS that was transmitting credentials in plain text.  Now, the accounts had no real function other than turning on visibility to some general content, but those users are more than likely reusing the same email addresses and passwords on other sites.  So any ill-willed individual wise to that could theoretically watch the traffic and nab all the login credentials.  From there, those credentials could be used to try logging in to social media accounts, financial institutions, etc.

The moral of the story being: get an SSL certificate. You may be unintentionally putting your users at risk.

If you're already hosting with Rebel IST, we make this easy and affordable with SSL certificates starting at just $14.95.
