Extra Media Marketing - Web Design Fort Myers, FL

Web Marketing Tips

Extra Media Marketing joins forces with Atilus

Monique Engicht - Tuesday, July 10, 2012

Through an agreement reached in June 2012, Extra Media Marketing, Inc. has joined forces with Atilus, LLC, another web design firm located in Southwest Florida, and will be doing business as Atilus, LLC from this point forward.

Extra Media Marketing firmly believes this step allows for accelerated growth and opens even more doors for its current clientele.

Please visit Atilus.com for more information or contact Monique@atilus.com.

10 SEO Techniques for Maximum Web Marketing for Your Business

Monique Engicht - Monday, April 02, 2012

So you have a website. Now what? It's not like potential customers will know your super-fancy.com domain by memory. You need to be found by search engines, and while there are many out there, most of your traffic will come from Google.com.

For success in getting crawled and ranked high in search engines, you need to know the basics of SEO, or Search Engine Optimization: the set of tactics that get your site found by search engines. If you are currently designing or redesigning your website, whether on your own or with a developer, there's a great Firefox add-on (Firebug) that can assist with reviewing most of the suggestions below. It's important to take ownership of the project and make sure current, accepted standards are being followed, so your website is not out of date right from the beginning.

Here are some guidelines to help your web pages and content get found online by the audience they were written for:

1. Meta Keywords Tag 

The Meta keywords tag is largely ignored by search engines today; however, it can still have a small impact on search engine rankings. The content of your Meta keywords tag should be approximately 3 to 5 keywords relevant to the page. There is simply no point in stuffing it with dozens of keywords.

2. Meta Description Unique to Each Page of your Website

When writing your Meta description you’ll find that the best-performing descriptions are those that are unique, not one-size-fits-all, and accurately summarize the content of the page.

3. Meta Description Tag Appropriate Length

Search engines will read up to 250 characters of your Meta description tag, but Google only displays the first 160. Keep the contents of the tag to a maximum of 160 characters and write your Meta description to be short, keyword-rich, and phrased in a way that will incentivize people to click on your search result. Google yourself from time to time to review how attractive your results look when they come up in search engines.
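
As a quick illustration, here is a small Python sketch (the function name and sample text are made up; the 160-character cutoff comes from the guideline above) that flags descriptions longer than what Google displays:

```python
def check_meta_description(description, limit=160):
    """Flag meta descriptions longer than the ~160 characters Google displays."""
    length = len(description)
    # Preview of what searchers would actually see, with an ellipsis where
    # the snippet would be cut off.
    preview = description if length <= limit else description[:limit - 1].rstrip() + "…"
    return {"length": length, "within_limit": length <= limit, "preview": preview}

result = check_meta_description("Affordable web design and SEO services in Fort Myers, FL. " * 4)
print(result["length"], result["within_limit"])
```

Running this on each page's description as you write it keeps the snippet inside Google's display window.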

4. Title Tag Length & Uniqueness

Your title tag content should be no longer than 70 characters. Search engines only display up to the first 70 characters of your page's title tag, and the title tag carries a lot of weight in the search engine algorithms. For your title tag to perform at its best, it should be unique and clearly indicate the content of your page. Do not reuse a standard title across all pages, as this will cost you traffic.
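
Both rules, length and uniqueness, can be checked across a site with a simple sketch like this (the page paths and titles are made up for illustration):

```python
def audit_titles(titles, max_len=70):
    """titles: {url: title}. Return (url, problem) pairs for long or duplicate titles."""
    seen = {}
    issues = []
    for page, title in titles.items():
        if len(title) > max_len:
            issues.append((page, "longer than %d characters" % max_len))
        if title in seen:
            # A title reused on another page costs both pages traffic.
            issues.append((page, "duplicate of " + seen[title]))
        else:
            seen[title] = page
    return issues

pages = {
    "/": "Web Design Fort Myers, FL | Extra Media Marketing",
    "/contact": "Web Design Fort Myers, FL | Extra Media Marketing",
}
print(audit_titles(pages))  # → [('/contact', 'duplicate of /')]
```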

5. Flash 

Search engines cannot read the contents of a Flash file. I highly recommend alternate ways of displaying whatever content you have in Flash, since most tablets and phones are also unable to display it. Search engines can’t see whether there is any text in the Flash movie or what that text might say. If you use Flash on your page, make certain to include plenty of descriptive text in close proximity to where the Flash is displayed in your page layout.

6. Frames

Frames in their most basic configuration consist of one web page displayed within another. Search engines cannot determine the relationship between the pages within a frameset. Search engines may find an individual page through a direct link to it, but typically the content inside a frame cannot be indexed through the frameset tag in the source code. Avoid using frames if you can.

7. Images and Graphics as Text

Search engines are unable to read text embedded in an image file, and they cannot determine the relationship of a graphic that is hyperlinked to another page or file. For that reason, you need to ensure that you use a descriptive ALT text attribute on your IMG tags, or a CSS image-replacement method. Do not have any image on your page that lacks ALT text.
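
A quick way to find offending images is to scan your markup with Python's built-in html.parser; this is a minimal sketch (the sample markup is invented), not a full crawler:

```python
from html.parser import HTMLParser

class MissingAltScanner(HTMLParser):
    """Collect <img> tags that lack a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "(no src)"))

scanner = MissingAltScanner()
scanner.feed('<img src="logo.png" alt="Company logo"><img src="banner.png">')
print(scanner.missing)  # → ['banner.png']
```

Feed it each page's HTML and fix every image it reports.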

8. Hidden Text

If you have hidden text on your page, whether through CSS display tricks or font color that matches the background, and you’re doing it because you think it will help your search engine rankings, remove the text from your page immediately. Google may block your page from the search engine results, and it will take a while to be reinstated. Hidden text should be avoided unless it is displayed through a toggle button, tooltip, or navigation interaction.

9. Tables  in Markup

HTML table-based layouts are not nearly as efficient as a CSS-based layout. The presence of HTML tables within your source code is not going to have a significantly negative impact on your search engine rankings, but they do create more code than a CSS-based layout which impacts the file size of your page. CSS-based layouts are also more beneficial because they present your content in a format that makes it easier for search engines to understand the relationship between all of the elements on your page.

10. HTML Sitemap & XML Sitemap

Your HTML sitemap page should have links to all of the web pages within your website. Use keyword rich, descriptive anchor text in the links. Link to your HTML sitemap page from the footer or header of all pages on your website and you’ll be providing your visitors with a way to find content on your website and you’ll help the search engines to find all of your content more easily too.

XML sitemaps are meant for machines to read rather than humans. They are the ideal way to tell the search engines which pages you would like them to index, and they can help get pages crawled and indexed.
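
If your CMS doesn't generate one for you, a minimal XML sitemap following the sitemaps.org protocol can be built in a few lines of Python (the URLs below are placeholders):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap per the sitemaps.org protocol."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["https://www.example.com/", "https://www.example.com/about"]))
```

Save the output as sitemap.xml at your site root and reference it from robots.txt.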


Southwest Florida Local Charities

Monique Engicht - Thursday, March 25, 2010
In search of local non-profit organizations around Lee County, I decided to create a comprehensive list including some very active charities in our area, just in case you have a warm heart today. :)

Fort Myers Car Donation to Charity

The non-profit organization that you select on the Vehicle Donation Form will benefit. To see charities and their mission statements, please click here to Search and View Charity Mission Statements. You may select any one of the charities on our list regardless of your location.

Harry Chapin Food Bank - Fort Myers, FL

The mission of the Harry Chapin Food Bank is “to overcome hunger in Charlotte, Collier, Glades, Hendry, and Lee counties through education and by working in a cooperative effort with affiliated agencies in the procurement and distribution of food, equitably and without discrimination.”

Goodwill Industries of SW Florida

Goodwill Industries of Southwest Florida serves residents in Lee, Collier, Charlotte, Hendry, and Glades counties with the mission of helping people with disabilities and other disadvantages overcome their barriers to employment and independence.

Salvation Army of Lee County

Join us to make a difference. With your help, The Salvation Army will continue assisting those who are homeless, abused or disadvantaged in pursuit of its goals: serving the most people, meeting the most needs, DOING THE MOST GOOD!

It's All About Relationships

Monique Engicht - Tuesday, March 16, 2010
While recently reading one of my favorite blogs, ProBlogger, and a post called “Why Your Business Needs Friends,” an idea that has resonated with me for the past year finally became clear.

In 2006, when the real estate market was booming and Southwest Florida was the fastest-growing area in the country, I met an overachieving young Realtor who was simply thriving in the midst of the RE hype. Being the naturally curious person I am, I could not help but ask, “How do you do it?” (referring to the fact that he had over 120 houses listed in Cape Coral at the time). To my disappointment at the time, his answer was, “I just make friends. Every day I go out and make friends.”

As a young and aspiring 21-year-old searching for the magic secret to having hundreds of listings, and therefore thousands of dollars in the bank, this oversimplified answer just didn’t make sense. It didn’t make any sense. Make friends? How can that bring me money? Wow, what a loser, I thought to myself about this self-made real estate guru. He has accomplished so much, and this is the best he can come up with?

Feeling like a red-headed stepchild, as if I was not important enough for this guy to give me a grown-up answer, I did not waste time thinking about it. Only now, three years after that episode and a little more experience later, am I craving to read a much-talked-about book on business relationships: “Never Eat Alone.”

Just as Darren mentions in his blog post, the book talks about having meaningful relationships with your peers, clients, and colleagues, and treating each person as an independent individual with their own desires and objectives. Maybe this relationship initiative we take with each person or prospect who walks into our lives comes easily to some, but not to me.

Lately, as I try to move into a higher level of business consultancy, I realize that to become a prominent consultant we need to take a legitimate interest in our clients' business processes and accompanying issues.

Taking a legitimate interest in our clients' business is hard!

People are egotistical by nature, and taking an interest in analyzing our clients' business processes, thinking critically toward a custom-tailored and worthwhile solution, is not easy. Hopefully each client or prospect we meet helps us metamorphose into that much-talked-about "friendly persona" who is simply brimming with work.

Search Engine Optimization Tips

Monique Engicht - Thursday, December 10, 2009

15 Minute SEO Audit

The basics of SEO problem identification can be done in about 15 minutes. When completing this audit I recommend you take notes based on the action items listed in each section. This will help you later when you do a deeper dive of the website. This audit is not comprehensive (see Chapter 9 for a full annotated site audit), but it will help you quickly identify major problems so you can convince your clients that your services are worthwhile and that you should be given a chance to dig deeper. The smart ones reading this section may notice that it builds upon the ideas expressed in Chapter 2. The dumb ones reading this will think it is Harry Potter. The latter might enjoy it more, but the former will end up with better SEO skills.

Prepare Your Browser

Before you start your audit you need to set your browser to act more like the search engine crawlers. This will help you to identify simple crawling errors. In order to do this, you will need to do the following:

  • Disable cookies in your browser
  • Switch your user-agent to Googlebot

How Do I Do This and Why Is It Important?

When the search engines crawl the Internet they generally do so with a user-agent string that identifies them (Google is googlebot and Bing is msnbot) and in a way where they don't accept cookies.

To see how to change your user-agent, go to Chapter 3 (Picking the Right SEO Tools) and see user-agent switcher. Setting your user-agent to Googlebot increases your chance of seeing exactly what Google is seeing. It also helps with identifying cloaking issues. (Cloaking is the practice of showing one thing to search engines and a different thing to users. This is what sarcastic Googlers call penaltybait.) To do this well, a second pass of the site with your normal user-agent is required to identify differences. That said, this is not the primary goal of this quick run-through of the given website.

In addition to doing this you should also disable cookies within your browser. By disabling them, you will be able to uncover crawling issues that relate to preferences you make on the page. One primary example of this is intro pages. Many websites will have you choose your primary language before you can enter their main site. (This is known as an intro page.) If you have cookies enabled and you have previously chosen your preference, the website will not show you this page again. Unfortunately, this will not happen for search engines.
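
If you prefer to script this check instead of reconfiguring your browser, the same crawler-like setup can be sketched with Python's urllib: the request carries Googlebot's public user-agent string, and urllib keeps no cookies unless you explicitly wire in a cookie handler. An illustrative sketch, not a full crawler:

```python
import urllib.request

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def crawler_request(url):
    """Build a request that mimics Googlebot. Plain urllib sends no cookies
    unless an HTTPCookieProcessor is added, matching the 'cookies disabled'
    behavior described above."""
    return urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})

req = crawler_request("https://www.example.com/")
print(req.get_header("User-agent"))
# To actually fetch the page: urllib.request.urlopen(req).read()
```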

This language tactic is extremely detrimental from an SEO perspective because it means that every link to the primary URL of the website will be diluted, since it has to pass through the intro page. (Remember, the search engines always see that page, as they can't select a language.) This is a big problem because, as we noted in Chapter 1, the primary URL (i.e. www.example.com/) is usually the most linked-to page on a site.

Homepage

Next, go to the primary URL of the site and pay particular attention to your first impression of the page. Try to be as true to your opinion as possible and don’t overthink it. You should be coming from the perspective of the casual browser. (This is made easier because at this point you probably haven’t been paid any money, and it’s a lot easier to be casual when you are not locked down with the client.) Follow this with a quick check of the very basic SEO metrics. In order to complete this step, you will need to do the following:

  • Notice your first impression and the resulting feeling and trustworthiness you feel about the page
  • Read the title tag and figure out how it could be improved
  • See if the URL changed (As in you were redirected from www.example.com/ to www.example.com/lame-keyword-in-URL-trick.html)
  • Check to see if the URL is canonical

How Do I Do This and Why Is It Important?

The first action item on this list helps you align yourself with potential website users. It is the basis for your entire audit and serves as a foundation for you to build on. You can look at numbers all day, but if you fail to see the website like the user, you will fail as an SEO.

The next step is to read the title tag and identify how it can be improved. This is helpful because changing title tags is both easy (A big exception to this is if your client uses a difficult Content Management System.) and has a relatively large direct impact on rankings.

Next you need to direct your attention to the URL. First of all, make sure no redirects happened. This is important because adding redirects dilutes the amount of link juice that actually makes it to the links on the page.

The last action item is to run a quick check on canonical URLs. The complete list of URL formats to check for is in Chapter 2 (Relearning How You See the Web). Like checking the title tag, this is easy to check and provides a high work/benefit ratio.
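
As a rough illustration of what a canonical check normalizes away, here is a small Python sketch; the list of default index filenames is just an example, not a complete canonicalization spec:

```python
from urllib.parse import urlparse, urlunparse

def canonicalize(url):
    """Collapse common URL variants (case, default index files, missing
    trailing slash) into one canonical form. Illustrative sketch only."""
    parts = urlparse(url.lower())
    path = parts.path or "/"
    for index in ("/index.html", "/index.php", "/default.asp"):
        if path.endswith(index):
            path = path[: -len(index)] + "/"
    return urlunparse((parts.scheme, parts.netloc, path, "", "", ""))

variants = ["http://www.Example.com", "http://www.example.com/index.html"]
print({canonicalize(u) for u in variants})  # → {'http://www.example.com/'}
```

If several live URLs collapse to the same canonical form, the site is splitting link value between duplicates.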

Secret:

Usability experts generally agree that the old practice of cramming as much information as possible “above the fold” on content pages and homepages is no longer ideal. Placing a “call to action” in this area is certainly important, but it is not necessary to place all important information there. Many tests have been done on this, and the evidence overwhelmingly shows that users scroll vertically (especially when led).

Global Navigation

After checking the basics on the homepage, you should direct your attention to the global navigation. This acts as the main canal system for link juice. Specifically, you are going to want to do the following:

  • Temporarily disable Javascript and reload the page
  • Make sure the navigation system works and that all links are HTML links
  • Take note of all of the sections that are linked to
  • Re-enable Javascript

How Do I Do This and Why Is It Important?

As we discussed in Chapter 2 (Relearning How You See the Web), site architecture is critical for search-friendly websites. The global navigation is fundamental to this. Imagine that the website you are viewing is ancient Rome right after the legendary aqueduct and canal systems were built. These waterways are exactly like the global navigation that channels link juice around a website. Imagine the impact a major clog can have on both systems. This is your time to find those clogs.

Your first action item in the section is to disable Javascript. This is helpful because it forces you to see your website from the perspective of a very basic user. It is also a similar perspective to the search engines.

After disabling Javascript, reload the page and see if the global navigation still works. Many times it won’t and it will uncover one of the major reasons the given client is having indexing issues.

Next view source and see if all of the navigational links are true HTML links. Ideally, they should be because they are the only kind that can pass their full link value.

Your next step is to take note of which sections are linked to. Ideally, all of the major sections will be linked in the global navigation. The problem is, you won’t know what all of the major sections are until you are further along in the audit. For now just take note and keep a mental checklist as you browse the website.

Lastly, re-enable Javascript. While this will not be accurate with the search engine perspective, it will make sure that AJAX and Javascript based navigation works for you. Remember, on this quick audit, you are not trying to identify every single issue with the site, instead you are just trying to find the big issues.

Secret:

The global navigation menus that are the most search engine friendly appear as standard HTML unordered lists to search engines and people who don't have Javascript and/or CSS enabled. These menus use HTML, CSS pseudo-classes and optionally Javascript to provide users feedback on their mouse position. You can see an example of this in Chapter 9.

Category Pages/Subcategory Pages (If applicable)

After finishing with the homepage and the global navigation, you need to start diving deeper into the website. In the waterway analogy, category and subcategory pages are the forks in the canals. You can make sure they are optimized by doing the following:

  • Make sure there is enough content on these pages to be useful as a search result alone.
  • Find and note extraneous links on the page (there shouldn’t be more than 150 links)
  • Take notes on how to improve the anchor text used for the subcategories/content pages
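
Counting the links on a page against the 150-link guideline above can be scripted with Python's built-in html.parser, as in this minimal sketch (the sample markup is invented):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count <a href> links so pages over the ~150-link guideline can be flagged."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # Only anchors with an href actually pass link value.
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

counter = LinkCounter()
counter.feed('<a href="/a">A</a><a href="/b">B</a><a name="anchor">x</a>')
print(counter.count)  # → 2
```
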
How Do I Do This and Why Is It Important?

As I mentioned, these pages are the main pathways for the link juice of a website. They help ensure that if one page (most often the homepage) gets a lot of links, the rest of the pages on the website also get some of the benefit. The first action point requires you to make a judgment call on whether or not the page would be useful as a search result. This goes with my philosophy that every page on a website should be at least a little bit link-worthy. (It should pay its own rent, so to speak.) Since each page has the inherent ability to collect links, webmasters should put at least a minimal amount of effort into making every page link-worthy. There is no problem with someone entering a site (from a search engine result or other third-party site) on a category or subcategory page. In fact, it may save them a click. In order to complete this step, identify if this page alone would be useful for someone with a relevant query. Think to yourself:

  1. Is there helpful content on the page to provide context?
  2. Is there a design element breaking up the monotony of a large list of links?

Take notes on the answers to both of these questions.

The next action item is to identify extraneous links on the page. Remember, in Chapter 2 we discussed that the amount of link value a given link can pass depends on the number of links on the page. To maximize the benefit of these pages, it is important to remove any extraneous links. Going back to our waterway analogy, these links are the equivalent of “canals to nowhere.” (Built by the Roman ancestors of former Alaskan Senator Ted Stevens.)

To complete the last action item of this section, you will need to take notes on how to better optimize the anchor text of the links on this page. Ideally, they should be as specific as possible. This helps the search engines and users identify what the target pages are about.

Secret:

Many people don’t realize that category and subcategory pages actually stand a good chance of ranking for highly competitive phrases. When optimized correctly, these pages will have links from all of their child content pages and from the website’s homepage (giving them popularity), and they include a lot of information about a specific topic (relevancy). Combine this with the fact that each link to one of their child content pages also helps the given page, and you have a great pyramid structure for ranking success.

Content Pages

Now that you have analyzed the homepage and the navigational pages, it is time to audit the meat of the website, the content pages. In order to do this, you will need to complete the following:

  • Check and note the format of the Title Tags
  • Check and note the format of the Meta Description
  • Check and note the format of the URL
  • Check to see if the content is indexable
  • Check and note the format of the alt text
  • Read the content as if you were the one searching for it
How Do I Do This and Why Is It Important?

The first action item is to check the title tags of the given page. This is important because it is both helpful for rankings and it makes up the anchor text used in search engine result. You don’t get link value from these links but they do act as incentives for people to visit your site.

Tip:

SEOmoz did some intensive search engine ranking factors correlation testing on the subject of title tags. The results were relatively clear. If you are trying to rank for a very competitive term, it is best to include the keyword at the beginning of the title tag. If you are competing for a less competitive term and branding can help make a difference in click through rates, it is best to put the brand name first. With regards to special characters, I prefer pipes for aesthetic value but hyphens, n-dashes, m-dashes and subtraction signs are all fine. Thus, the best practice format for title tags is one of the following:

  • Primary Keyword - Secondary Keywords | Brand
  • Brand Name | Primary Keyword and Secondary Keywords
See http://www.seomoz.org/knowledge/title-tag/ for up-to-date information

Like the first action item, the second has to do with a metric that is directly useful for search engines rather than people (it is only indirectly useful for people once it is displayed in search results). Check the Meta description by viewing source or using the mozBar and make sure it is compelling and contains the relevant keywords at least twice. This inclusion of keywords is useful not for rankings but because matching terms get bolded in search results.

The next action item is to check the URL for best practice optimization. Just like Danny Devito, URLs should be short, relevant and easy to remember.

The next step is to make sure the content is indexable. To ensure this, make sure the text is not contained in an image, Flash, or a frame. To make sure it is indexed, copy an entire sentence from the content block and search for it within quotes in a search engine. If it shows up, it is indexable.
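
The quoted-sentence check can be scripted too; this small sketch just builds the exact-match query URL (the Google search URL format here is a common convention, used for illustration):

```python
from urllib.parse import quote_plus

def exact_match_query(sentence):
    """Build the quoted exact-match search URL used to test whether a
    sentence from a page is indexed."""
    return "https://www.google.com/search?q=" + quote_plus('"%s"' % sentence)

print(exact_match_query("Fort Myers web design"))
```

Open the printed URL in a browser: if the page appears in the results, that sentence is indexed.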

If there are any images on the page (as there probably should be, for users' sake), you should make sure that the images have relevant alt text. After running tests on this at SEOmoz, my co-workers and I found that relevant alt text was highly correlated with high rankings.

Lastly, and possibly most importantly, you should take the time to read the content on the page. Read it from the perspective of a user who just arrived from a search engine result. This is important because the content is the main purpose for the page's existence. As an SEO, it can be easy to become content-blind when doing quick audits. Remember, the content is the primary reason this user came to the page. If it is not helpful, visitors will leave.

Links

Now that you have an idea of how the website is organized it is time to see what the rest of the world thinks about it. To do this, you will need to do the following:

  • View the amount of total links and the amount of root domains linking to the given domain
  • View the anchor text distribution of inbound links

How Do I Do This and Why Is It Important?

As you read in Chapter 1 (Understanding Search Engine Optimization), links are incredibly important in the search engine algorithms. Thus, you cannot get a complete view of a website without analyzing its links.

This first action item requires you to get two different metrics about the inbound links to the given domain. Separately, these metrics can be very misleading due to internal links. Together, they provide a fuller picture that makes accounting for internal links possible and thus more accurate. At the time of writing, the best tool to get this data is through SEOmoz’s Open Site Explorer.

The second action item requires you to analyze the relevancy side of links. This is important because it is a large part of search engine algorithms. This was discussed in Chapter 1 (Understanding Search Engine Optimization) and proves as true now as it did when you read it earlier. To get this data, I recommend using Google’s Webmaster Central.

Search Engine Inclusion

Now that you have gathered all the data you can about how the given website exists on the internet, it is time to see what the search engines have done with this information. Choose your favorite search engine (you might need to Google it) and do the following:

  • Search for the given domain to make sure it isn’t penalized
  • See roughly how many pages are indexed of the given website
  • Search three of the most competitive keywords that relate to the given domain
  • Choose a random content page and search the engines for duplicate content

How Do I Do This and Why Is It Important?

As an SEO, all of your work is completely useless if the search engines don’t react to it. To a lesser degree, this is true for webmasters as well. The above action items will help you identify how the search engines are reacting to the given website.

The first action item is simple to do but can reveal dire effects. Simply go to a search engine and search for the exact URL of the homepage of your domain. Assuming it is not brand new, it should appear as the first result. If it doesn’t, and it is an established site, it has major issues and was probably thrown out of the search engine indices. If this is the case, you need to identify it clearly and as early as possible.

The second action item is also very easy to do. Go to any of the major search engines and use the site command (as defined in Chapter 3) to find roughly how many pages of a domain are indexed in the engine. For example, this may look like site:www.example.com. This is important because the difference between the number that gets returned and the number of pages that actually exist on the site says a lot about how healthy a domain is in a search engine. If there are more pages in the index than exist on the site, there is a duplicate content problem. If there are more pages on the actual site than in the search engine index, there is an indexation problem. Either is bad and should be added to your notes.
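
The index-count comparison boils down to a simple rule, sketched here (the labels are just my shorthand for the problems described above):

```python
def index_health(pages_on_site, pages_in_index):
    """Classify indexation health from the site: query count vs. the actual page count."""
    if pages_in_index > pages_on_site:
        return "possible duplicate content"
    if pages_in_index < pages_on_site:
        return "indexation problem"
    return "healthy"

print(index_health(200, 450))  # → possible duplicate content
```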

The next action item is a quick exercise to see how well the given website is optimized. To get an idea of this, simply search for 3 of the most competitive terms that you think the given website would reasonably rank for. You can speed this process up by using one of the third party rank trackers that are available. (Refer back to Chapter 3)

The final action item is a quick search for duplicate content. This can be accomplished by going to a random indexed content page on the given website and searching for either the title tag (in quotes) or the first sentence of the page (also in quotes). If there is more than one result from the given domain, it has duplicate content problems. This is bad because it forces the website to compete against itself for rankings, and it forces the search engine to decide which page is more valuable. That decision-making process is best avoided because it is difficult to predict the outcome.

FROM: SEOMOZ