Fonts: Where? How? Why?

Fonts can give a page personality. They can be quirky, romantic, silly, or no-nonsense. Remember, though, that with a straight font call your viewers will only see fonts that are also installed on their machines. So for body text you are generally limited to a few universal font families. For headers and other splashy areas, however, you can create GIFs or PNGs using any font you wish to add some character to your pages.

Naturally the question then becomes: how do I find cool fonts to play with? That can be as easy as typing "fonts" into your favorite search engine. There are literally thousands of free fonts out there. Do mind your basic browser safety, though, and do not visit sites that your browser flags as questionable.

Once you are on a site, try checking out some styles. This can be an overwhelming process, so it is usually best to have a style in mind to begin with. You might also want to set a limit on how many fonts you will allow yourself to download on any one trip to the font market. The possibilities are endless for the types of cursive, printed, typed, painted, symbolic, and image-based fonts you can find for free online. Have fun checking things out.

Now that you have found a free font you like, how do you add it to your computer? PC users simply right-click on the font, choose Save File As, and save the file to their hard drive. How you install the font depends on your version of Windows. First go to Control Panel and open the Fonts folder. Check whether the upper menu has an Install Font option. If it does, use it to install the font(s). If it does not, simply copy the font from where it was saved into the Fonts folder.

Mac users download the font by clicking the download button and saving it to their hard drive. To install the font, first close all running programs. Drag and drop the fonts into either the System folder or the Fonts folder, depending on your OS version, then restart your computer. Do remember to use Mac-specific fonts, and do not install multiple formats of the same font; this can lead to trouble down the line.

Do notice that many fonts are downloaded as a zip file. This simply means you will have to open the file and extract the font to your hard drive before trying to install it.

Once installed on your computer, the new fonts should show up in all your programs: word processing, graphics, and web layout. Again, keep in mind that with direct web layout use of type, viewers will only see fonts they also have on their computers. Still, this leaves you a vast amount of room to create image files with fancy fonts to intersperse on your pages. And if you are creating printed items, your font choices will appear exactly as you chose, because they will be printed out.

Even after all this, is the right font just not out there for you? You can also create your own! Whether drawn by hand and scanned in, or built in a graphics program, your own lettering can become a font. One option is to use the letters as individual graphics files, or you can pay to have them transformed into TrueType fonts.

Ultimately there is no excuse for not letting your site’s personality shine through, even in the styling of the words on the page. Have fun with your fonts and let your passion shine through!

Which Font To Choose?

Choosing the fonts for your web page is a big step. First you must understand that the only fonts any viewer will see on your page are fonts that also exist on their hard drive. This brings us to two important points.

First, if you want something fancy for headers, titles, or buttons that is not one of the generic fonts installed on most computers, create GIF images and use those instead of a straight font call.

Second, instead of narrowing your choice to a single font call, use a font set to widen the range of acceptable fonts for the body typeface. You specify a grouping of similar font families to create a font set. This means fewer browsers will simply fall back to a default typeface that may look nothing like what you hoped the page would look like when viewed.

A font family means all the variants of a specific font grouped together, such as bold, italic, and normal. If you browse your fonts folder you can often see the differing variants as separate icons.

There are many types of fonts to use, but on screen, sans serif fonts are the most readable. Sans serif means without the little tails on the ends of the letters. In fact, Verdana is a font family that was created specifically for the web. A good generic grouping of sans serif font families would be Verdana, Trebuchet MS, Lucida Sans, Arial, or alternatively Helvetica, Arial, Geneva, Lucida Sans. When you call a set, the browser goes through the listed font families in order and sets the page’s text with the first available family that matches.

Monospaced fonts are also often used on web pages, but usually for coding examples. Think typewriter-style fonts, where every character is equally spaced and sized. These include Monaco, Courier, Courier New, and Lucida Console.
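As a sketch of how such font sets might be declared in CSS (the exact family names and ordering are up to you, these are just example stacks):

```html
<style type="text/css">
  /* Body text: the browser tries each family in order and
     falls back to the generic sans-serif if none match. */
  body {
    font-family: Verdana, "Trebuchet MS", "Lucida Sans", Arial, Helvetica, sans-serif;
  }
  /* Code samples: monospaced, typewriter-style families. */
  code, pre {
    font-family: Monaco, "Courier New", "Lucida Console", Courier, monospace;
  }
</style>
```

Ending each list with a generic family (sans-serif or monospace) guarantees the browser always has something reasonable to fall back on.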

In review, when choosing your fonts keep these simple guidelines in mind.
1. Viewers will only see fonts that are already installed on their computer’s hard drive.
2. List a font set of similar font families to increase the chances of your page looking the way you want it to in all browsers.
3. Use sans serif for screen, serif for print only, and monospaced is good for code or typewriter effects.
4. Keep usage of cursive and fantasy (highly stylized) fonts to short sections, and realize that unless they are rendered as GIFs, there is a good chance most visitors will not see that cool font you thought you were using.
5. Don’t go overboard changing fonts all over the page or within sentences and paragraphs. Clean and simple web pages tend to get the viewer involved more in your pages than those which are overdone and hard to read.

Go forth and play with the countless variations of fonts, but do so with your viewer in mind!

Protecting Images Online

Many artists are finding it advantageous to have a web site to use as a digital portfolio, but at the same time are leery about how to post their images in a way that protects their artwork and copyrights. Here are a few things you can do as an artist to protect your art images online.

Size, Format & Resolution

Resolution determines how clear the image will be on screen and when printed. Even though newer monitors display at 96 dpi (dots per inch), it is still suggested that you save web images at 72 dpi to make them less attractive targets for printing or enlarging by others.

Size is how large the image will appear on screen or in print. Do note that, depending on your dpi, screen size and print size can differ greatly. Generally, as an artist you will want an image large enough to convey the general idea of the piece but not so large that it can be downloaded and reprinted at any useful quality by others. Sizing images for the web is largely a personal choice with no hard and fast rules. That said, keep in mind that a large number of people are still on dial-up connections because of their geographical location, so keeping image sizes, as well as the number of images posted per page, to something dial-up can handle is a good rule of thumb. Thumbnails generally run 50-100 pixels on the largest edge, and full-sized images a maximum of 400-500 pixels on the largest edge.

Format is the type of file you save the image as. Generally GIFs and JPGs are the preferred formats on the web. JPGs are usually favored for straight visuals, with GIFs used for images that include text. Note that JPG compression is lossy, so a JPG loses information with each successive save; GIFs are limited to 256 colors but do not degrade further on re-save.

Search Engines, Bots, Spiders & Indexing

The first thing you should always do on a web site is include an index.html file in the folder where you keep your images. This ensures that if someone types in http://yoursite.com/images they will not get a listing with links to every image you have stored there. You can either make the index page a redirect to another area or simply copy your site’s main index page into the folder. Many servers are now set up to disallow folder listings on any folder on a domain, but check and be sure.
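A minimal redirect page dropped into the images folder might look like this sketch (the destination URL is just a placeholder for your own site):

```html
<html>
<head>
  <title>Redirecting</title>
  <!-- Send anyone who lands in this folder back to the main page -->
  <meta http-equiv="refresh" content="0; url=http://yoursite.com/" />
</head>
<body>
  <p>Please visit <a href="http://yoursite.com/">our main page</a>.</p>
</body>
</html>
```

Save it as index.html inside the images folder; anyone browsing to the folder is immediately bounced to your main page instead of seeing a file listing.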

Search engines now offer image-only searches, so you need to adapt your pages accordingly. With these searches it is possible for people to view or download your images without ever visiting your pages. Meta tags placed in the HEAD section of your web pages can help. The HEAD section is the area between the <head></head> tags toward the top of the page’s HTML. This is generally how a protected page’s code would look:

<meta content="noindex, nofollow, NOIMAGEINDEX, NOIMAGECLICK" name="robots" />
<meta content="NOARCHIVE" name="ROBOTS" />

Additional meta tag information and tools can be found at http://2webhead.home.comcast.net/metatool0.htm

Another help is to have a robots.txt file in the root directory of your web site. Excluding search engines from directories with robots.txt is a standard way to protect yourself. Place all the images you want to protect in a subdirectory such as /images or /pix. Next create a plain text file called robots.txt. You cannot create robots.txt in a word processor, because the formatting will keep it from working. Use a PLAIN TEXT editor, like Notepad or TextPad, and save as plain text, not rich text.

This example will keep most searchbots from indexing anything in the listed directories. The contents of robots.txt would look something like this:
User-agent: * # directed to all robots
Disallow: /cgi-bin/sources
Disallow: /cgi-bin/
Disallow: /sound/
Disallow: /pix/
Disallow: /images/

If you want to keep all bots out of all subdirectories, your robots.txt would look like this:
User-agent: *
Disallow: /
Not all searchbots honor robots meta tags, nor do all of them pay attention to robots.txt files, but almost all will obey one or the other, so by using both you can help protect the images on your site. Search engines with image searching, like Google or Lycos, will remove the images in your protected directories from display within roughly 4 to 12 weeks after you add a robots.txt file.

Remove cached pages/images
Google (and several other search engines) keeps copies of many of the pages it crawls available in a cache. This allows an archived, or "cached", version of a web page to be pulled up if the original page is offline. If you want to prevent all robots from archiving content on your site, use the NOARCHIVE meta tag. Place this tag in the <head> section of your documents as follows:
<meta content="NOARCHIVE" name="ROBOTS" />
Note: You can only have one robots meta tag in each web page; the first example showed how to string a number of content values together to fine-tune the exact permissions you want the robots to obey. Also, the NOARCHIVE tag only removes the "cached" link for the page; the search engine continues to index the page for searching.

No Right Click Coding
You can put a no-right-click script on your page, but that will only deter people who don’t know how their computer works, or who use particular browsers. At last count that is still a lot of people, so it can still be a helpful script.

<script language="JavaScript">
<!--
// No rightclick script v.2.5
// (c) 1998 GigiKnight
// Don't delete this header!
var message = "This page was made by me! If you want something you're going to have to ask for it"; // Message for the alert box
// Don't edit below!
function click(e) {
  if (document.all) {
    if (event.button == 2) {
      alert(message);
      return false;
    }
  }
  if (document.layers) {
    if (e.which == 3) {
      alert(message);
      return false;
    }
  }
}
if (document.layers) {
  document.captureEvents(Event.MOUSEDOWN);
}
document.onmousedown = click;
// --> </script>
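The script above targets very old browsers (document.all for Internet Explorer and document.layers for Netscape 4). As a sketch of a different technique for current browsers, the standard contextmenu event can be intercepted instead; the message text here is just a placeholder:

```html
<script type="text/javascript">
// Modern alternative to the 1998 script: intercept the
// standard contextmenu event, which fires on right-click.
document.addEventListener("contextmenu", function (e) {
  e.preventDefault(); // suppress the browser's context menu
  alert("Please ask before using images from this page.");
});
</script>
```

The same caveat applies: this only deters casual visitors, since anyone can disable JavaScript or use their browser's developer tools to reach the images.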

WordPress users also have quite a few plugins available to help protect their content, both visual and written.

Watermarking
Watermarking is a process where you add extra information to the image to make it unusable by others. It can be as simple as saving the image with your URL printed in bold across the middle, or as elegant as hidden text that only shows up when the file is saved or printed. Some methods are easier to get around than others, but basically anything you do digitally to an image can be undone as well.

Using any of these simple means when posting your images will help you protect your image’s integrity and copyrights. You don’t have to fear the world wide web but you should use caution and common sense to keep you and your images safe online.

Plain text editors

Plain text editors are necessary when working with certain file types like robots.txt, CSS, and .htaccess files. You cannot have any extraneous formatting in such files, or it will corrupt the file and keep it from working online.

The good news is that you can get wonderful plain text editors for free all over the web. I highly recommend EditPad Lite as an option. It does everything you need with a small footprint. (Footprint being how much room the program takes up on your hard drive; operating systems like Mac OS and Windows have behemoth footprints, while graphics programs are merely humongous.) It is also very simple and intuitive to use. Mac friends recommend TextEdit and TextWrangler as good free plain text editors.

robots.txt file

Now, one of the most important files you can create with a plain text editor to protect your web site is the robots.txt file.

Search engines and other parties use spiders and bots to crawl the web and index all the pages they come across in any domain. The information harvested can be used in many ways; search engines, for example, use it to answer search requests.

There are good bots and bad bots. The good bots obey the robot protocols and only index those areas they are invited or allowed into. They are also most often helpful to you in getting folks directed to your site. Bad bots, on the other hand, will not even query for a robots.txt file or follow any meta tag directions meant for bots. They are intrusive and will index areas of your site you do not want anyone getting into. Most often they are run to harvest email addresses listed or posted on the site, or to probe for chinks in your site’s security.

To protect yourself, a good start is the robots.txt file. This text file is placed at the root of your site; in other words, it belongs in the same area as your site’s index page, so its URL should be http://yourdomain.com/robots.txt . It is always named robots.txt and is a plain text file uploaded to the site in ASCII format only. Basically, this file tells spiders and bots what areas of your site they are or are not allowed to index.

The first part of a robots.txt file identifies the User-agents. Here you can be specific, applying a denial or an all-access pass to a particular spider/bot, or use the wildcard symbol * to apply the rules to all spiders/bots that request the file as they visit your site. The asterisk * in the User-agent field is a special value meaning "any robot". A good database of search engine spiders/bots can be found here: http://www.robotstxt.org/wc/active/html/ .

If you do not know for sure which bots you want to include or exclude specifically, it is suggested you just use the wildcard character. Then do your research and use your logs to help identify good and bad bots.

The second part of a robots.txt file is where you set the permissions. Here you can give an all-access pass, deny access to the whole site, or protect just certain sections of it, either folders or individual pages. Do remember, though, that if you only disallow access to a specific page, the rest of its folder remains open and accessible to the bots.

Use a plain text editor (no formatting!) to create your robots.txt file. Do make sure to save it as robots.txt and to upload it in ASCII format to your root directory on the server.

Some examples:
Here nothing is disallowed and bots can follow all links:

User-agent: *
Disallow:

Bans all bots and prevents your entire site from being indexed:

User-agent: *
Disallow: /

Bans all robots from specified sections of the site:

User-agent: *
Disallow: /private/
Disallow: /images/
Disallow: /cgi-bin/
Disallow: /misc/topsecret.htm

Bans a single robot and allows no access at all:

User-agent: Slurp
Disallow: /

Bans a single robot from specified sections of the site:

User-agent: Slurp
Disallow: /private/
Disallow: /images/
Disallow: /cgi-bin/
Disallow: /misc/topsecret.htm

Allows a single robot complete access and bans all others:

User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /

There does need to be a blank line between each separate User-agent record, as in the example above.

It should be mentioned that you may find a lot of 404 errors in your web stats if you do not have a robots.txt file. These come from all the good bots that query for the file before indexing the site and do not find it.

The reasons for excluding files from some or all spiders could be privacy, log files, or pages optimized for a particular search engine which you would not want indexed by other search engines. If you have something you definitely do not want seen, it should either be in a private folder that is disallowed through robots.txt, be password protected, use .htaccess to dissuade bots from indexing it, or, best of all, simply not be put onto the web at all.

The robots.txt file is just one facet of site security and is best used in coordination with robots meta tags as well as .htaccess files. But even alone it can help protect your site, and your server load, by directing bots where they can and cannot go.

Creating Favicons

So just how do we create these favicon images? It’s not too difficult if you keep the idea of simplicity in mind. These are small images, so you want something clean and simple that is easily recognizable at those small scales.

*Special note to you Adobe CS users: GoLive will automatically create favicon images using graphics from your web page, as will Dreamweaver.* Otherwise, here is a simple way to create favicons in Adobe Photoshop. Other graphics programs can be used similarly; only the exact menu calls might differ.

First, the main size is 16 x 16 pixels, so create a new image and size it to that. LOOK at how tiny this is; you need to think minimal to create a good and worthwhile favicon. Now try creating a new image sized at 64 x 64 pixels. This is usually the recommended size to play with favicon designs in. Just keep in mind that the large 32-pixel-square image will be half this size, and the standard 16-pixel-square image one fourth of it.

Try working with images from your logo or index page within the 64 pixel square size. When you come up with something you like select that image file, click on Image and choose duplicate. Do this twice. Choose one of the duplicate images and click on Image again and choose image size. Here change the size to 16 x 16 pixels. Try changing the other duplicate to 32 x 32 pixels. Compare the results, are you happy with the image at both sizes? Is it recognizable at both sizes?

Sometimes just paring the image down to its essentials will help here. Even then, a trick for the smaller sizes can be to simply raise the contrast of the image. To do this, select the image, click on Image, choose Adjust, and then Brightness/Contrast. The pop-up window lets you play with the contrast, and moving toward the + side can sometimes be just the thing to bring your miniature image into better focus.

Now, if you are saving the image for use as a GIF, make sure you are at 16 x 16 pixels and using 256 colours. If you are saving it as a PNG, again use 16 x 16 pixels, and you can use either the 256-colour or 24-bit setting. You can check these settings by selecting the image file and choosing Image, then Mode.

Now, if you wish to make a Windows icon file version (favicon.ico), there are two ways to go about it. Either do a web search and find one of the many on-line utilities where you upload your graphic image and it converts it to an .ico file you download, or try http://www.telegraphics.com.au/sw/ and download a plugin which allows Photoshop to open, edit, and save .ico files. Both Mac and Windows versions are available for free download.

Once you have created your image, make sure the 16 x 16 pixel icon is saved as favicon.ico, favicon.gif, or favicon.png. You can also add the larger 32 x 32 file by naming it favicon32.ico, favicon32.gif, or favicon32.png. Upload the icon file(s) to the root folder of your site, meaning the same place where you have your main index.html file. Do NOT place it into a graphics or images folder!
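Most browsers will automatically request /favicon.ico from the root of your site, but GIF and PNG favicons generally need an explicit link in the page's head section. A sketch of what that markup might look like (file names assume the naming above):

```html
<head>
  <!-- Point browsers at the favicon explicitly; most will
       still fall back to /favicon.ico if these are missing. -->
  <link rel="icon" type="image/png" href="/favicon.png" />
  <link rel="shortcut icon" href="/favicon.ico" />
</head>
```

Add the link tags to every page where you want the icon to appear, or put them in a shared template or include.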

That’s it! So why not add a little personalization to your sites that will appear to everyone browsing your page? Or even just brighten up your own desktop a little with your own custom icons? Happy icon creating!
