Platform, Platform, Platform

If we were attending a convention focused on real estate, we would hear the mantra "Location, Location, Location," but since we are at Dreamforce 2010, we are hearing "Platform, Platform, Platform!" Of course, not just platform, but cloud platform. This is the focus and the hype of Dreamforce 2010 and will be for all of 2011. Thirty thousand attendees showed up for the latest news from Salesforce. The anticipation is over, the announcements have been made, and now it is time to watch and see how it all plays out. Salesforce.com is not new to innovation or big announcements, and this year is no different. Focused on the developer, adding Database.com and Heroku to the Salesforce arsenal is a great move. This instantly broadened their offerings to developers, giving new and existing Salesforce customers the ability to expand beyond Apex and write applications in Java or Ruby. I read one blog that called it the "wedding of the year"! 2011 will provide some interesting insight into this merger and what becomes of it. The possibilities of new products and new capabilities are intriguing.


Sandbox Preview Window 

Dreamforce is over and SFDC is not missing a beat as the Sandbox Preview Window for Spring ’11 release is around the corner.

  • Scheduled dates are January 21, 2011 - February 12, 2011
  • Salesforce blogs are ready to answer your questions with an FAQ and tables to assist you in determining your plan of action if you are looking at uploading or installing AppExchange packages using your Sandboxes during the Preview Window.

Once again, Salesforce is hard at work putting the latest functionality at developers' fingertips. Sandbox customers have the privilege of early access to exciting new features and customizations. There are several blog posts that share detailed instructions, FAQs, and more information on the upcoming release. Do take a look at the details on refreshing your sandbox to ensure success as you experiment with the update.


DupeCatcher is a Must Have

I've posted about DupeCatcher recently on my Twitter account, but I think it is worth mentioning again here. I've been a part of the beta for this new Salesforce utility and I love it. DupeCatcher is a free Salesforce lead, contact, and account duplicate blocker. It blocks or flags duplicate leads as they are typed in manually through Salesforce. There are other tools like this, but this one is free. If you haven't tried it out, go to their website and fill in the evaluation form. They have not updated the site yet to say that it is in fact free, but I am told they will in the next week or so.

I have a feeling we'll be hearing more about this company in the future here.



Preparing for Salesforce Security Review 

(Part 1 of 2)

Getting ready for the security review can be a little nerve-racking. I've been through this process before, so I want to share with you how I prepared for the security review and how the process goes. One of the best and worst things about Salesforce development is how much documentation there is. This is fantastic for a Salesforce developer, but it is also a struggle because you feel like you could be missing something. In addition, you might have to bounce around to three different documents before you find what you are looking for. This is all part of the fun though. When you find the answer you were looking for, it's a great feeling!

When you start to plan for your Salesforce security review, there is a lot of information to parse through. What I want to do here is provide some of the highlights to get you moving in the right direction.

Step 1 - The very first thing you want to do when preparing for the Salesforce security review is to go through the Requirements Checklist in detail. Much of this is standard web application development best practice. There are three areas, though, that I paid special attention to: A) making sure that all my triggers were bulkified, B) making sure my unit tests were up to par, and C) protecting against SOQL injection. I mention the first two because the trigger aspect was something I wasn't as familiar with, and I found gaps in my unit testing (see the unit testing best practices). Salesforce requires that at least 75% of your code is covered by unit tests, but I would shoot for higher than that. In your unit tests, make sure to use the "System.assert" methods as much as possible - you want to prove that your code is working properly, not just that it executes. I've changed my development style from when I started on the platform to be more test driven.
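To illustrate both points, here is a minimal sketch (the class, trigger, and field values are just illustrative, not from the review itself) of a bulk-aware test method that uses System.assert-style methods to prove the outcome rather than settling for code coverage:

```apex
@isTest
private class LeadTriggerTest {

    static testMethod void testBulkInsert() {
        // build 200 leads so the trigger under test is exercised in bulk,
        // the same way a data import or API load would hit it
        List<Lead> leads = new List<Lead>();
        for (Integer i = 0; i < 200; i++) {
            leads.add(new Lead(LastName = 'Test' + i, Company = 'Acme ' + i));
        }

        Test.startTest();
        insert leads;
        Test.stopTest();

        // prove the records actually made it in - coverage alone is not enough
        System.assertEquals(200,
            [SELECT COUNT() FROM Lead WHERE LastName LIKE 'Test%']);
    }
}
```

If your trigger can't survive a 200-record insert like this, it isn't bulkified, and that will surface during the review.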

Step 2 - Take the time and add proper comments to your code. I don't know how much this plays into the security review but it is the best way to develop. As a general rule, I write the comments as if I have to hand this code over permanently to my friend and they will be maintaining it moving forward. The reason for this is sometimes programmers write code knowing they understand how it works and if someone has questions they can just ask.

I work under the assumption that I may be busy with another project and I don't have time to explain everything but I also don't want that person to struggle. I picture my best friend having to pick up where I left off and I don't want him stressed out trying to figure it out. The nice thing to do is write enough comments so they can smile when they look at the code and say, "cool, this guy did a great job of explaining what all of this does." It's a bit of a mental exercise but it seems to work well for me.

Step 3 - Go through the OWASP Top Ten Checklist. OWASP is a valuable resource for keeping up with the latest web application threats you need to guard against. Make sure you go through and check your code against all of these. In my case, I needed to pay special attention to injection and cross-site scripting.
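For the injection piece specifically, the standard Apex defense is String.escapeSingleQuotes on anything user-supplied before it goes into a dynamic SOQL string. A small sketch (the page parameter name is hypothetical):

```apex
// value pulled from a Visualforce page parameter - hostile until proven otherwise
String userInput = ApexPages.currentPage().getParameters().get('accountName');

// Unsafe: concatenating raw input lets an attacker break out of the quotes
// String query = 'SELECT Id FROM Account WHERE Name = \'' + userInput + '\'';

// Safer: escape single quotes so the input stays inside the string literal
String safeInput = String.escapeSingleQuotes(userInput);
List<Account> accounts = Database.query(
    'SELECT Id, Name FROM Account WHERE Name = \'' + safeInput + '\'');
```

Better still, use static SOQL with bind variables wherever you can - binds are never treated as query syntax.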

Step 4 - Take advantage of the free code scanners. I can't tell you how great these resources are.

Self-service source code analyzer (free): scans your Apex and Visualforce code and produces a nice report. It's important to know that it can take anywhere from an hour to a full day to get the results back. I try to kick it off toward the end of the day after I've added a bunch of new code and made fixes.

Web-application scan (free): scans any web servers you have that integrate with Salesforce.


Each time you go back and make changes to your code, you will want to check to make sure your unit tests still cover all of your code and that you have followed all the best practices. Each time I make changes I kick off the source code analyzer.


Maintaining Data Quality in Salesforce

Salesforce data quality is an important topic. One of the most frustrating things that can happen to a business is needing data from Salesforce only to realize that all data up to point X is useless. Let me give you a simple example from personal experience. Years ago, I was working with a sales team to help them pinpoint customers that were likely to cancel before they actually did so. They needed to find ways to maximize their time and resources by proactively following-up with accounts that statistically fit the profile of an account that was about to cancel. The first place I started was with the data.

As any database administrator knows, the data is only as good as the people who are entering it. The old adage "garbage-in-garbage-out" comes to mind. In my case, I was really excited to find out that the accounting department had been giving cancelling customers a brief survey before they closed out the account. I went into the database and started to run some queries. What I quickly found was that the number of questions answered on each survey depended largely on which account rep was closing the account. Some either never asked the questions or didn't take the time to fill in some of the key details. In the end, I was able to get the data but I had to pull it in from 5 different sources and even then I wasn't confident in what I had.

This is a classic example of when data is needed to make good business decisions and the data just plain doesn't exist, or is of such poor quality that it cannot be counted on. While there is not much you can do about it after the fact, the lesson to take away is that NOW is the time to collect good quality data. Companies waste countless marketing dollars each year pursuing bad prospects. With a little bit of effort up front, these problems can be avoided.

As a system administrator, part of your job is to realize that your priorities are often not going to sync up with those of the end user. Making a field required is a single click for you, whereas it could mean an extra 15-20 seconds per call for the CSR who uses that form 200 times a day (up to 66 extra minutes of work). To do your job effectively you need to walk the line between what is best for the user and what is best for the company. In my opinion you always lean toward the company, but communicate to the user why it is so important. Laying out a business case for why something needs to be done helps. In addition, it helps a lot to listen to the concerns of the end users and do what you can to make them feel a part of the decision-making process.

So, what are some ways to ensure data quality within Salesforce?

Design and implement a data policy that ensures your users fill out all fields in an accurate manner. In addition, the users should be consistent. Accuracy combined with consistency creates data integrity and reliability. A good data policy should also include standardization. This means creating naming conventions that are instilled in all users. In addition, these data standards must be enforced in Salesforce by validating the data as it is typed in. As an admin, you'll want to take away as many variables as possible. Create pick-lists in your forms whenever possible. Also, make sure that you know where your data is coming from. If someone is getting ready to do a mass import, make sure to analyze and cleanse the data before it gets imported into Salesforce. Here are some simple examples of standardization:

  • Country/State: use validation to standardize AZ vs Arizona, USA vs U.S.
  • Account names: Inc vs Incorp., Ltd vs LTD, Limited
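As a concrete example, a validation rule is one way to enforce the first item. The formula below is an illustrative sketch for standard Lead Country/State fields (adjust field names to your org); the save is blocked whenever the error condition evaluates to true, forcing the two-letter state abbreviation:

```
AND(
    Country = "US",
    LEN(State) <> 2
)
```

Pair it with an error message like "Please enter the two-letter state abbreviation (e.g. AZ)" so users know how to comply rather than just that they failed.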

Track data quality. Tracking data quality on an on-going basis is very important. In order to know that you are making progress and maintaining the data standards you've outlined, it is important to get some baseline metrics. Here are some things I recommend you start to look at (feel free to add your own):

  • Prospect Accounts Missing # of Employees (last 60 days)
  • Opportunities with Close Date (last 60 days)
  • Lead Rating on Converted Leads (monthly %). If 90% of the converted leads are classified as cold, then what are hot leads?
  • Lead Source (last 60 days)
  • Leads with a status of "Qualified" that have not been converted (monthly %)
  • Contacts without valid email addresses
  • Accounts without Industry specified (last 60 days)
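Most of these metrics can be pulled with a simple report or SOQL query. As a sketch, the last item might look something like this when run from anonymous Apex (the exact fields and window are up to you):

```apex
// Accounts created in the last 60 days that have no Industry specified
List<Account> missingIndustry = [
    SELECT Id, Name, CreatedDate
    FROM Account
    WHERE Industry = null
      AND CreatedDate = LAST_N_DAYS:60
];
System.debug('Accounts missing Industry: ' + missingIndustry.size());
```

Run the same queries on a regular schedule and chart the counts over time - the trend matters more than any single number.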

Base-64 Encoding in Apex

The other day someone came to me and asked how to Base64 encode some parameters in an Apex class. Base64 is often used when you need to encode binary data into characters. It is a good way of taking binary data and turning it into text so that it can easily be transmitted in things like HTML form data and email. Salesforce makes it pretty easy to perform Base64 encoding in Apex via the EncodingUtil class. Below is an Apex code snippet with a very simple example of a Base64 encode/decode.

    string before = 'Testing base 64 encode';
    // create a blob from our parameter value before we send it as part of the url
    Blob beforeblob = Blob.valueOf(before);
    // base64 encode the blob that contains our url param value
    string paramvalue = EncodingUtil.base64Encode(beforeblob);
    // print out the encoded value to the debug log so we can see it before/after base64 encode
    System.debug(before + ' is now encoded as: ' + paramvalue);
    // take the base64 encoded parameter and create base64 decoded Blob from it
    Blob afterblob = EncodingUtil.base64Decode(paramvalue);
    // Convert the blob back to a string and print it in the debug log
    System.debug(paramvalue + ' is now decoded as: ' + afterblob.toString());

You can refer to the EncodingUtil documentation here for more information. In addition, you can take a look at the documentation for the primitive data type 'Blob' here.
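One caveat if you do put the encoded value in a URL, as in the snippet's comments: Base64 output can contain '+', '/', and '=' characters, so it should also be URL-encoded first. A quick sketch (the target URL is just a placeholder):

```apex
String before = 'Testing base 64 encode';
String b64 = EncodingUtil.base64Encode(Blob.valueOf(before));

// Base64 is not URL-safe on its own - escape it before using it as a parameter
String urlSafe = EncodingUtil.urlEncode(b64, 'UTF-8');
String target = 'https://www.example.com/page?data=' + urlSafe;
System.debug(target);
```

On the receiving end, reverse the steps: EncodingUtil.urlDecode, then EncodingUtil.base64Decode.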


Sep 22, 2010 - Salesforce.com Unveils Chatter 2

In less than a year, Salesforce.com has gone from almost no social media functionality to a fully loaded social platform. Chatter 2 brings more new tools to the table in the hope that more users will begin to use it. The current adoption rates are already incredible, though, so for most of us this is all bonus functionality. Here is a list of some of the new features to take advantage of:

  • Files - you can now drag-and-drop files from your desktop directly into Chatter. In my view, this is a big one. I can't tell you how much time this will save.
  • Email digest - great for those of us who don't have time to sit and watch the latest posts and feeds. You can now setup daily or weekly recaps.
  • Topics - just like the hashtags we're all familiar with from Twitter.
  • Chatter filters
  • Analytics - as a system admin this is great. I can now quickly generate a report and email it off to the department heads.
  • Desktop integration through Adobe AIR. This allows you to see the latest feed without keeping a browser window open. Screen real estate is a precious commodity - this is a nice add-on.
  • Search, recommendations, and more...



Regular Expressions in Apex Code

Every once in a while it's very helpful to use regular expressions in your code. The problem is that resources and examples are often scattered. I wanted to give two simple examples of how you can write and test regular expressions in your Apex code inside Salesforce.

Return all the numeric characters in a string via regular expression

For the sake of quickly writing and testing this, I've just created a new Lead trigger that runs before insert. I've then set a test input string "input" and I'm using a Pattern object to return the matches in a new string "test". I then print a system debug statement so that I can look in the debug logs to see my results.

trigger myLeadTrigger on Lead (before insert) {

    string input = 'a3f45qq456';

    for (Lead l : Trigger.new) {
        //instantiate a new Pattern object
        Pattern p = Pattern.compile('[^0-9]');

        //return a string that contains only numeric values from 0-9 from my original input
        String test = p.matcher(input).replaceAll('');

        //print a debug statement with my test results
        System.debug('numeric characters: ' + test);
    }
}


Output from the debug log shows us that this is working:


As you can see, "345456" are the numeric chars from the original string "a3f45qq456".


Return all the alphabetical characters in a string via regular expression

The setup is the same as before: a Lead trigger that runs before insert, a test input string "input", and a Pattern object that returns the matches in a new string "test". The only difference is the pattern itself, which now strips everything that is not a letter.


trigger myLeadTrigger on Lead (before insert) {

    string input = 'a3f45qq456';

    for (Lead l : Trigger.new) {
        //instantiate a new Pattern object
        Pattern p = Pattern.compile('[^a-zA-Z]');

        //return a string that contains only alphabetical characters from my original input
        String test = p.matcher(input).replaceAll('');

        //print a debug statement with my test results
        System.debug('alphabetical characters: ' + test);
    }
}


Output from the debug log shows us that this is working:


As you can see, "afqq" are the alpha chars from the original string "a3f45qq456".




Winter '10 Release 

Code Scheduler & Batch Code Processor

I'm just now getting the chance to sit down and read the release notes for the Winter '10 release. There are some cool things going in from a developer standpoint (not to mention as an end user).

Code Scheduler

The code scheduler being introduced is awesome. It's a cron-like mechanism that allows you to configure when your processes run so that you don't have to kick them off manually. It sounds simple, but this is a really nice thing to have. You can now create execution schedules and kick off Apex code. The code scheduler also allows you to monitor and edit schedules programmatically (as well as through the UI).
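A minimal sketch of what this looks like (the class name and the work it does are just examples): a class implements the Schedulable interface, and System.schedule registers it against a cron-style expression.

```apex
global class NightlyCleanup implements Schedulable {

    global void execute(SchedulableContext sc) {
        // whatever work should run on the schedule goes here
        System.debug('Nightly cleanup running');
    }
}
```

You then kick it off once, for example from anonymous Apex: System.schedule('Nightly cleanup', '0 0 2 * * ?', new NightlyCleanup()); - the expression fields are seconds, minutes, hours, day-of-month, month, and day-of-week, so this example fires every night at 2 AM.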

Batch Code Processor

With the batch code processor, you can now batch and deploy asynchronous processes. This allows you to perform operations on an entire set of data within a single batch process. I can think of about 101 uses for this. The example they give in the release notes is perfect - building a process to validate all your account addresses and have it run in the background.
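Sticking with their address-validation example, the skeleton of a batch class looks roughly like this (the class name is hypothetical and the validation logic is omitted): implement Database.Batchable, return the full record set from start, and process each chunk in execute.

```apex
global class AccountAddressValidator implements Database.Batchable<sObject> {

    global Database.QueryLocator start(Database.BatchableContext bc) {
        // the full set of records the batch will walk through in chunks
        return Database.getQueryLocator(
            'SELECT Id, BillingStreet, BillingCity FROM Account');
    }

    global void execute(Database.BatchableContext bc, List<sObject> scope) {
        for (sObject s : scope) {
            Account a = (Account) s;
            // validate a.BillingStreet / a.BillingCity here
        }
    }

    global void finish(Database.BatchableContext bc) {
        // runs once after every chunk has been processed
    }
}
```

Kick it off in the background with Database.executeBatch(new AccountAddressValidator()); and it will work through every account without tripping the governor limits a single synchronous pass would hit.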

Read the full Winter '10 release notes here.