I recently built and launched my first two sites on force.com: this one and qandor.com. Overall it was a very good experience, and force.com sites functionality worked the way I had understood it to. However, there were a few finer points I wish someone had told me about before I started… so I’ve put together 5 little bits of knowledge that might help you out as you endeavor to build a delicious site of your own.
1. Cache flow
I don’t know about you, but when it comes to web development, I’m big on “trial and error”: every little change I make to the elements or CSS or whatever, I go and refresh the page in a browser so I can see the results… change… refresh… assess… repeat. The problem I ran into with Sites was that I would make a change to a page, or to the data in an object displayed on a page, and it would take up to several minutes before the change showed up on my site. At first I thought it was the browser, so I tried clearing my cache and refreshing… no dice.
Luckily I struggled with this issue just before I went to Dreamforce so I brought the question to the salesforce campground where I found a very simple fix that made me feel dumb for having overlooked it. There is a property on the Visualforce “apex:page” component called “cache”. You can set it to “false” to tell the salesforce servers that they should not cache the page.
Doing so will lead to increased bandwidth usage, so they advise only setting it to “false” during development or when truly necessary. There is also a closely related property, “expires”. Rather than have me tell you about it, see the following help article (salesforce login required): Caching Sites Pages
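For reference, both attributes sit right on the page tag. Here’s a minimal sketch (the page body is just a placeholder):

```
<!-- sketch only: the page content is a placeholder -->
<apex:page cache="false" expires="0">
    <!-- cache="false" asks the servers not to cache this page;
         expires sets how many seconds a cached copy may live -->
    <h1>Hello, Sites!</h1>
</apex:page>
```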
2. Custom domain pain
Custom domains are vital if you want a polished website running on Force.com: you get a web address like www.mycoolsite.com instead of mycoolsite.force.com. If you really want to get acquainted, read this recipe. Really cool stuff!
Ok, so I registered the domain I wanted, “michaelforce.org”… ok, the one I really wanted, “michaelforce.com”, was taken by some weird self-help guy or something… so I got “.org”, whatever, I’m not bitter. Then I followed the instructions in that recipe. The problem came up when a user typed in “michaelforce.org” (without the “www.”) and hit go: it wouldn’t end up in the correct place. That’s because you can only set up CNAME records for “www.mycoolsite.com” or other subdomains like “support.mycoolsite.com”; if someone types just “mycoolsite.com”, the DNS will send them to the IP address defined in the A (host) record. So, I’ll just edit the A record IP address to be that of my force.com site… right? No, force.com sites don’t have static IP addresses. Bummer.
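To make the DNS situation concrete, the relevant entries look roughly like this (domain and IP are placeholders, not real values):

```
; hypothetical zone entries for mycoolsite.com
www.mycoolsite.com.   IN  CNAME  mycoolsite.force.com.
mycoolsite.com.       IN  A      203.0.113.10   ; bare domain needs a fixed IP,
                                                ; which force.com sites don't provide
```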
So the solution I had to resort to was to find a web server with a static IP address, have that server redirect to my force.com site… and then point the A record in the DNS at that IP address. One of the main reasons for using force.com sites is so you don’t have to manage a server! Luckily a friend of mine was already running one and hooked me up with an IP. I would totally give him a shoutout, but will refrain because I don’t want the whole world asking him to do the same thing for them! (assuming anyone reads this thing… haha)
Anyhow, here’s a caveman drawing of the dodgy solution:
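And in case the drawing doesn’t do it for you, here’s roughly what the redirect on that borrowed server could look like (an Apache sketch, assuming Apache; any server that can issue a 301 would do):

```
# hypothetical Apache vhost on the server with the static IP
<VirtualHost *:80>
    ServerName michaelforce.org
    # bounce bare-domain visitors over to www, which CNAMEs to force.com
    Redirect permanent / http://www.michaelforce.org/
</VirtualHost>
```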
3. Do the Robot!
I’m not talking about the robots @metadaddy builds with his son, or the robot uncle hank does at weddings when he’s had too much maker’s mark. I’m talking about a little text file, robots.txt, that websites expose to declare which parts of their site should or should not be crawled by web spiders and search engines. It’s extremely easy to create one for your Force.com site: you just create a Visualforce page, then edit your Force.com site properties and use the “lookup” on the “Robots.txt” field (naturally).
I’m not going to go into detail on exactly how you go about including or excluding specific parts of your website for specific engines… you’re on your own there. I just wanted to give a heads up that if you DON’T define a robots file, then you can’t have google ads on your site. You see those glorious, sexy ads in the right column over there? They weren’t showing up at all until I added the following file to my site. It’s as basic as you can get… it declares that no pages or engines are off limits, that’s it.
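The file, reconstructed here, is a Visualforce page along these lines (the contentType attribute makes it serve as plain text; the empty Disallow means nothing is off limits):

```
<apex:page contentType="text/plain">
User-agent: *
Disallow:
</apex:page>
```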
4. New Field outta site?
This tip is pretty simple: after you have your site established and you add a new field to an object… the field is going to be hidden from the “guest” profile by default. This is a good thing… it keeps you from accidentally displaying a new field to the world. But keep it in mind during development, otherwise you’ll create your new field and scratch your head for a while wondering why you can’t see it on the site. You’d figure it out eventually I’m sure, but now I’ve saved you the trouble ;-).
5. Web Services have to repeat third grade
Ok, so to be honest, this issue isn’t just related to sites… it’s more of a general apex issue that I ran into while building my site: the difficulty involved in writing apex tests to cover classes that call web services.
In michaelforce.org I built in calls to the Twitter API, reCAPTCHA, and Bit.ly. All three were a huge pain in the ass to write test methods for. The basic reason is that test methods can’t make callouts! So you have to find ways for your code to run everything “but” the callouts in your apex. I ended up using the trusty, popular method of a custom “isTest” property in my apex, so that when testing is underway the code skips the actual callout and instead jams in a response that looks just like a normal response from the web service I was trying to call.
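Here’s a rough sketch of that pattern (not my actual code; the class, method, and endpoint are made up for illustration):

```
public class ShortenerClient {
    // test methods flip this flag so the callout is skipped
    public static Boolean isTest = false;

    public static String shorten(String longUrl) {
        if (isTest) {
            // jam in a canned response shaped like the real thing
            return 'http://bit.ly/fakeUrl';
        }
        HttpRequest req = new HttpRequest();
        req.setEndpoint('http://example.com/shorten?longUrl=' + longUrl);
        req.setMethod('GET');
        HttpResponse res = new Http().send(req);
        return res.getBody();
    }
}
```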
Anyhow, rather than me babbling on about my ugly hacks, let me share some links to the resources that helped me get my arms around the topic:
Please leave a comment if you have a lesson of your own to share, or want to add more detail to one of my points. Hope this helps. Cheers!