Seven tips on search engine optimization for .edu sites

Writing and designing for today’s web is about writing and designing for humans and robots.

Content must not only pique the interest and meet the needs of a site’s human visitors; it must be equally nutritious to the Pac-Man appetites of search engines like Google, Yahoo, and others. Every search engine has an application — a crawler, spider, or bot — that finds and follows the links on your site, sending what it finds back to a database, where the search engine’s proprietary algorithm analyzes it to rank your pages and serve up a search engine results page (or SERP, in the lingo). Optimizing your site for search means having content, structure, and technology that help both the crawler move over the site and the search engine rank the results.

Search optimization is a mix of science and magic and a continually moving target. Even the search engine optimization (SEO) experts don’t agree on all techniques. And an .edu site is going to have different priorities for SEO than a business site that exists for e-commerce. But since many SEO techniques are easy and actually encourage good organization and content practices, adopting them as part of your site development and maintenance guidelines is not a waste of time, particularly in an era of belt-tightening for traditional marketing budgets. Here are seven tips gleaned from the realm of SEO to get you started:

1. Use descriptive page titles — We’re talking the TITLE tag here, and it’s one of the first things both humans and robots encounter. Simply repeating the name of your institution on every page is not sufficient. Each page should have a unique title with keywords relevant to the page content (but not stuffed with them), front-loaded with the words that matter most. Aim for no more than 66 characters and use title case.
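For instance, a sketch of the difference (the page topic and institution name are made up):

    <!-- Too generic: the same title repeated on every page -->
    <title>Example University</title>

    <!-- Better: unique, front-loaded with page-specific keywords -->
    <title>Undergraduate Biology Admissions | Example University</title>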

2. Put effort into the most valuable meta tags — The meta description tag deserves your attention, not necessarily for its influence over rankings, but because its content can be what search engines display on SERPs. You want to control that display, not leave it to the crawler’s best guess. Keep it to around 160 characters and make it unique to each page. The meta keywords tag, by contrast, has been so abused by spammers that it carries little to no weight with search engines. If you use it at all, vary it from page to page; repeating the same words in the keywords tag across the whole site looks more spammy than legitimate.
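A sketch of a page-specific description, with invented copy:

    <!-- Unique to this page; this is what can appear on the SERP -->
    <meta name="description" content="Application deadlines, requirements, and
      contacts for the undergraduate biology program at Example University.">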

3. Use heading tags — Crawlers give headlines more weight than body text, and the H1, H2, and so on tags are how you indicate headlines and their relative importance to search engines. In the dark ages before CSS, we were saddled with fixed heading sizes that were often too big, too small, or otherwise ugly in the layout, so we sometimes used other tags (or even images, gasp) to style headlines. Now with CSS we can visually style H tags any way we like, and they can be used to add robot-readable structure to a web page. Headings should be descriptive and include relevant keywords when possible.
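A sketch, with made-up headings, of structure the crawler can read while CSS does the visual work:

    <style>
      /* Size and style the headings however the layout demands */
      h1 { font-size: 1.6em; }
      h2 { font-size: 1.2em; font-weight: normal; }
    </style>

    <h1>Undergraduate Biology Program</h1>  <!-- the page's main topic -->
    <h2>Degree Requirements</h2>            <!-- a subsection -->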

4. Write one topic per page — This is a tip most pro content developers follow. Not only does it help your human readers, but the algorithms search engines use work best on one topic at a time. Keeping your writing focused also makes it easier to come up with keywords and a meta description for a page. And since you’re sticking to one topic per page, you can keep it short and get to the point quickly, right? Headlines, subheads, and concise paragraphs are good SEO writing, and consistency among those helps search engine crawlers (and humans) understand your content.

5. Don’t be lazy with your links — The anchor text of a link gives descriptive information about the content of the link’s destination page and can influence search engine rankings. Lazily written “click here” links, for example, say nothing about the destination page, but may get you a top ranking for “click here.” (Search “click here” in Google. Hello, Adobe.) Use keywords in the link text that are relevant to the destination page. And while you’re at it, pay attention to how you link to the PDFs, videos, images, and similar assets that are all part of today’s sites. A search engine cannot tell that’s the spring commencement video unless the link to it says “spring commencement video” and it sits next to text in the page about spring commencement. And, for similar reasons, make sure all your images (including logos and images used as buttons) have appropriate text in their ALT attributes.
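A sketch of the difference, with made-up URLs:

    <!-- Says nothing about the destination -->
    <a href="/commencement/video">Click here</a>

    <!-- Descriptive anchor text the crawler (and reader) can use -->
    <a href="/commencement/video">Watch the spring commencement video</a>

    <!-- Images need descriptive ALT text too -->
    <img src="/img/logo.png" alt="Example University logo">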

6. Understand the search implications of technologies — This is a whole topic unto itself, but be aware of the search implications of your technical choices. Flash, for example, has improved in its ability to be indexed and to let search engines find the content and links embedded within Flash objects. But it’s unlikely that search engines will ever open themselves up to full compatibility with Flash, because that would also open the door to being gamed by unethical optimizers. Current search engines still don’t index Flash content on par with HTML, and HTML pages will generally rank higher.

There are similar challenges with AJAX and JavaScript. Search engines can’t deal very well with the dynamic, “pageless” content these technologies enable. The functional and stylish enhancements that JavaScript can bring to a site’s navigation can also block a search engine’s ability to build a model of the site’s link structure: the crawler sees only the initial page load, so content that AJAX swaps in afterward goes unseen. There are techniques, such as progressive enhancement, that deal with these issues and may be worth considering.
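One such technique is to serve plain, crawlable links and layer the dynamic behavior on top. A minimal sketch, with made-up URLs (and using the browser’s fetch API for the AJAX call):

    <!-- Real links the crawler can follow, with or without JavaScript -->
    <ul id="nav">
      <li><a href="/admissions.html">Admissions</a></li>
      <li><a href="/news.html">News</a></li>
    </ul>
    <div id="content"></div>

    <script>
      // Enhancement only: when scripting is available, load pages in
      // place; when it isn't, the plain links above still work.
      document.getElementById('nav').addEventListener('click', function (e) {
        var link = e.target.closest('a');
        if (!link) return;
        e.preventDefault();
        fetch(link.getAttribute('href'))
          .then(function (response) { return response.text(); })
          .then(function (html) {
            document.getElementById('content').innerHTML = html;
          });
      });
    </script>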

And with the growing popularity of content management systems (CMS) in education, institutions face a whole slew of additional considerations that affect search. For example, it’s not uncommon to “restart” a site within a CMS, generating a new URL structure for all the content. Search engines, however, have indexed your site under the old URLs, so unless the old addresses are permanently redirected (an HTTP 301) to the new ones, you are effectively starting over at ground zero when you flip the CMS switch. Content management systems can also generate problematic URLs and duplicated content, which don’t make search engines happy either. If you’re considering a CMS, questions to the vendor about how it supports search are in order.

7. Bring back the site map — Have a good old site map page, a hierarchical list of all the links on the site. The popularity of site maps has waned, but they are good for SEO. For one thing, such an alternative link structure can help make up for crawler blockers like JavaScript and AJAX. You can also hand the same list of links directly to search engines as a sitemap file (conventionally at /sitemap.xml or /sitemap.txt) to help them understand your site’s structure.
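A sketch of the page variety, with made-up sections and URLs:

    <h1>Site Map</h1>
    <ul>
      <li><a href="/admissions/">Admissions</a>
        <ul>
          <li><a href="/admissions/apply/">How to Apply</a></li>
          <li><a href="/admissions/visit/">Campus Visits</a></li>
        </ul>
      </li>
      <li><a href="/academics/">Academics</a></li>
      <li><a href="/news/">News and Events</a></li>
    </ul>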

Like all things web, developing for search optimization is a balance between human needs and the needs of technology. It can be challenging, but in many cases what works well for one — structure, conciseness, explanation, consistency — also benefits the other. Search-optimized content can be a win-win for human and robot.


This post originally appeared in Stein Communications’ The Scoop.
