Stat Tracker

Thursday, December 23, 2010

Leveraging Scheduled Apex to link Batch Apex Jobs

Winter '14 NOTE: This pattern is deprecated. Salesforce.com now supports invoking Database.executeBatch() in the finish() method of a Batch Apex class, so this pattern is no longer needed!
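With that support, the chaining boils down to a one-liner. A minimal sketch (using the AccountBatch2 example class from the post below; the finish() method of the first batch class simply becomes):

global void finish(Database.BatchableContext BC)
{
    //As of Winter '14, the next batch can be kicked off directly from finish()
    Database.executeBatch(new AccountBatch2('Select Id, Name from Account'), 20);
}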

When my batch Apex job finishes, how do I execute a second batch Apex job? It's a question I see frequently on the developer boards at developer.force.com, and for good reason. Many enterprise developers who have moved over to Force.com have worked with batch schedulers in some form or another (Control-M, mainframe JCL, etc.).

But governor limits prevent batch Apex from directly invoking another batch Apex job. How can we ensure that a second batch process executes only after the first batch process completes successfully? There is more than one way to solve this problem; in this case I am going to show you how to solve it using Apex scheduling. The pattern looks like the following:
  1. Create batch Apex 1.
  2. Create batch Apex 2.
  3. Create schedulable Apex.
  4. Inside the finish() method of batch Apex 1, invoke the schedulable Apex with an execution time of now() plus a small buffer.
  5. Inside the schedulable Apex, execute batch Apex 2.
This pattern will allow you to execute batches sequentially, and only after successful completion of the first batch. The pattern can be repeated to chain multiple jobs together.

Now for some quick sample code. The code below doesn't do anything valuable other than demonstrate the pattern.

BATCH APEX 1 - The first batch Apex process.


global class AccountBatch1 implements Database.Batchable<sObject>
{
   global final String query;

   global AccountBatch1(String q)
   {
      query = q;
   }

   global Database.QueryLocator start(Database.BatchableContext BC)
   {
      return Database.getQueryLocator(query);
   }

   global void execute(Database.BatchableContext BC, List<sObject> scope)
   {
      List<Account> updateAccts = new List<Account>();
      for(sObject s : scope)
      {
         Account a = (Account) s;
         a.Name = a.Name + ' Batch 1.';
         //Collect each modified record; without this the update below is a no-op
         updateAccts.add(a);
      }
      update updateAccts;
   }

   //The batch process has completed successfully. Schedule the next batch.
   global void finish(Database.BatchableContext BC)
   {
      System.debug(LoggingLevel.WARN, 'Batch Process 1 Finished');
      //Build a cron expression for now + 20 seconds to schedule the next batch
      Datetime sysTime = System.now().addSeconds(20);
      String chron_exp = '' + sysTime.second() + ' ' + sysTime.minute() + ' ' + sysTime.hour() + ' ' + sysTime.day() + ' ' + sysTime.month() + ' ? ' + sysTime.year();
      System.debug(chron_exp);
      AccountBatch2Scheduler acctBatch2Sched = new AccountBatch2Scheduler();
      //Schedule the next job, appending the system time so the job name is unique
      System.schedule('acctBatch2Job' + sysTime.getTime(), chron_exp, acctBatch2Sched);
   }
}

BATCH APEX 2 - The second batch Apex process.


global class AccountBatch2 implements Database.Batchable<sObject>
{
   global final String query;

   global AccountBatch2(String q)
   {
      query = q;
   }

   global Database.QueryLocator start(Database.BatchableContext BC)
   {
      return Database.getQueryLocator(query);
   }

   global void execute(Database.BatchableContext BC, List<sObject> scope)
   {
      List<Account> updateAccts = new List<Account>();
      for(sObject s : scope)
      {
         Account a = (Account) s;
         a.Name = a.Name + ' Batch 2.';
         //Collect each modified record; without this the update below is a no-op
         updateAccts.add(a);
      }
      update updateAccts;
   }

   global void finish(Database.BatchableContext BC)
   {
      System.debug(LoggingLevel.WARN, 'Batch Process 2 Finished');
   }
}

SCHEDULABLE APEX - Just executes the next batch Apex.

global class AccountBatch2Scheduler implements Schedulable
{
   global void execute(SchedulableContext ctx)
   {
      AccountBatch2 acctb2 = new AccountBatch2('Select Id, Name from Account');
      Id batchProcessId = Database.executeBatch(acctb2, 20);
   }
}
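To kick off the whole chain, execute the first batch, for example from an anonymous Apex block:

//Start the sequence; AccountBatch1 schedules AccountBatch2 from its finish() method
Id batchId = Database.executeBatch(new AccountBatch1('Select Id, Name from Account'), 20);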

Leveraging these classes, you can execute this batch sequence end to end. I have included some of the monitoring screenshots to show the output of this process.

(Screenshot: the Apex Jobs monitoring page, showing both batch jobs executed.)

(Screenshot: the Scheduled Jobs page, showing the scheduled Apex that executes the second batch.)

This is not the only solution to the problem, but it is the one I personally prefer. You could also use Salesforce Email Services to chain the jobs if you so desire.

I have an idea posted on the Salesforce Idea Exchange to allow Batch Apex to call another Batch Apex directly from the finish() method. You can vote for it here: Batch Apex invoke from Finish Method of another Batch Apex.

Thursday, December 16, 2010

Organization of Salesforce Apex Code

I have noticed, the more I use Apex and the more I come across other folks' Apex code, that coding standards are really slacking. Sure, there will always be folks who write SOQL queries and DML calls inside for loops, forcing us to bulkify that code, but the real problem is writing Apex in a manner that is easy for other developers to understand and maintain.

Far too often I find business logic slammed inside triggers, breaking sharing rules and being downright hard to maintain and understand. Add to that some weird naming conventions on Apex classes and I get lost! Oh man, and I shouldn't even mention unit tests bunched together into one giant test method, or a bunch of unit tests crammed into one class with no clear labeling.

Pile on top of that the fact that Apex doesn't allow packages (i.e., com.client.apex.data.xxxxxxx), so you're left with a flat structure for all your code. How can one make this easy to understand and maintain? My brain is frazzled!

I have learned a few approaches along the way, so I thought I'd share. I'd love to hear what other ideas folks have for structuring their Apex code. These are just some guidelines I've used, and they've kept the organization clean on several projects. (Some minimal sketches follow the list.)

1. Only have one trigger per object, and name it "ObjectTrigger". Inside the trigger, delegate calls to an "ObjectTriggerHandler" class based on the trigger context variables (Trigger.isInsert, for example); see the sketch after this list.

2. For each trigger, create a test class and call it "ObjectTriggerTest". Don't bunch a pile of different trigger tests into one huge class called "TriggerTests"; that makes it hard for another developer to quickly find where the tests for a given trigger live.

3. Create multiple test methods inside your trigger tests. Don't lump a bunch of test conditions into one huge test method. Break your test methods out to cover specific unit test cases and name them meaningfully. Then when another developer runs your tests and one fails, they can see exactly what failed without wading through a ton of code.

4. Break your SOQL queries out into data access classes named "ObjectDAO". This allows you to reuse SOQL queries across different classes.

5. Create a constants class and call it something like "ClientConstants". This allows reuse of constants and gives people one easy place to look at and maintain those values.

6. Name your batch and scheduled Apex "ObjectBatch" and "ObjectScheduler", with tests named "ObjectBatchTest" and "ObjectSchedulerTest". Perhaps you see a pattern to my naming evolving here?
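To make guidelines 1 through 3 concrete, here is a minimal sketch of the trigger, handler, and test layout. The handler method names and test scenarios are just illustrations, not a framework, and the trigger and classes each live in their own file:

trigger AccountTrigger on Account (before insert, before update)
{
    //Delegate to the handler based on the trigger context variables
    if(Trigger.isInsert)
    {
        AccountTriggerHandler.onBeforeInsert(Trigger.new);
    }
    else if(Trigger.isUpdate)
    {
        AccountTriggerHandler.onBeforeUpdate(Trigger.new, Trigger.oldMap);
    }
}

public with sharing class AccountTriggerHandler
{
    //All business logic lives here, not in the trigger body
    public static void onBeforeInsert(List<Account> newAccounts)
    {
        for(Account a : newAccounts)
        {
            //...bulkified business logic...
        }
    }

    public static void onBeforeUpdate(List<Account> newAccounts, Map<Id, Account> oldAccounts)
    {
        //...bulkified business logic...
    }
}

@isTest
private class AccountTriggerTest
{
    //One focused, meaningfully named test method per case
    static testMethod void testInsertAppliesBusinessLogic()
    {
        Account a = new Account(Name = 'Unit Test Account');
        insert a;
        //System.assertEquals against the specific expected outcome here
    }

    static testMethod void testUpdateAppliesBusinessLogic()
    {
        //...a second, separate scenario...
    }
}

And for guidelines 4 and 5, a sketch of the DAO and constants classes (the names, query, and values are illustrative):

public with sharing class AccountDAO
{
    //Reusable SOQL lives here so other classes don't duplicate the query
    public static List<Account> getAccountsByIds(Set<Id> accountIds)
    {
        return [Select Id, Name from Account where Id in :accountIds];
    }
}

public with sharing class ClientConstants
{
    //One easy place to look up and maintain shared values
    public static final Integer DEFAULT_BATCH_SIZE = 20;
}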

If you follow the structure above, anyone who comes into your org will see the Apex code lined up cleanly, with related classes grouped together (the trigger, tests, batch, and scheduled Apex for each object will all sit next to each other).

Happy coding folks! And let me know if you have other ideas on structuring your org's code!

Tuesday, December 14, 2010

Force.com Labs Released Project Milestones on the AppExchange!

Force.com Labs recently released the Project Milestones application. If you're looking for a lightweight project management tool for your SFDC org, it's pretty handy. You can create projects, milestones, tasks, expenses, and time entries. It supports XML export/import of your projects, as well as task creation via Email Services out of the box. It also leverages Chatter and can auto-follow tasks for assigned users, which is neat.

I used a preview/alpha build on a small project recently and it worked well. It's not meant to replace something like MS Project or Primavera TeamPlay, but for a free lightweight app it's a nice pickup.

Sunday, December 12, 2010

Chicago RCN - DNS Server Outage 1 week after Comcast

Less than one week after Comcast's DNS server issues caused an outage for Comcast customers in the Midwest, RCN is experiencing the same type of outage in Chicago. As of 10:15 AM this morning, all of my devices suddenly lost internet connectivity. Just when I was getting some work done before the Bears game!

I called tech support, and the phone system immediately notifies Chicago customers that there is an outage, so I just hung up. No sense trying to get an account credit for an outage that may only amount to a few dollars.

Instead, on my work laptop (which runs Windows 7), I switched to using OpenDNS and everything started working again.

To change your DNS Server Settings in Windows 7:

1. Open "Network and Sharing Center". Click Start -> Control Panel -> Network and Sharing Center.

2. Click Network Adapter Settings in the left side of the window.
3. On your network connection, Right Click -> Properites

4. Click on "Internet Protocol Version 4 (TCP/IPv4)", then click the Properties button.

5. Select the "Use the following DNS server addresses" radio button and fill in the OpenDNS server values: Preferred DNS server 208.67.222.222 and Alternate DNS server 208.67.220.220. You can also switch to the Google Public DNS servers by setting Preferred: 8.8.8.8 and Alternate: 8.8.4.4.

In the long run, I think I'll just wait until DNS service is restored rather than reconfiguring my router and my other Mac devices to use OpenDNS. But if your internet stops working and you still have a connection to your ISP, it's worth remembering that you can point at another DNS server to restore service in a pinch.

For another great tutorial on how to do this, check out: http://www.sevenforums.com/tutorials/15037-dns-addressing-how-change-windows-7-a.html

Thursday, December 9, 2010

Salesforce Winter 10 - Attachment Trigger Bug (before Insert, after Insert)

This past week I discovered a very annoying bug with Attachment triggers in Salesforce. As of the Salesforce Winter '10 release we can assign triggers to the Attachment object. Note that you can't do this via the browser Setup UI; you'll need to use the Force.com IDE. The problem is that the insert triggers won't always fire from the browser user interface!
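For reference, the shape of such a trigger is nothing exotic. A minimal sketch (my illustration, not the actual project code):

trigger AttachmentTrigger on Attachment (before insert, after insert)
{
    if(Trigger.isBefore && Trigger.isInsert)
    {
        for(Attachment att : Trigger.new)
        {
            //...set or validate fields on the new Attachment here...
            System.debug('Inserting attachment: ' + att.Name);
        }
    }
}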

For the project I'm working on, I had to build a trigger on the Attachment object. I built the trigger, the trigger handler class, and the unit test class. Everything looked perfect: unit tests were passing with 100% coverage, and all the System.assertEquals statements showed the business logic working for all the DML operations (insert, update, delete, etc.). So I was feeling good about getting a decent amount of work done fairly rapidly, and that's when I hit the wall.

I attempted to spot-check the Attachment functionality via the browser, the way a standard user would attach a document. That's when I noticed the fields had incorrect values, almost as if the trigger didn't fire! I opened up the good old System Log window, set the debug options, and tried again. I didn't see the Attachment trigger firing at all. What the heck? It worked during inserts in my unit tests, so why didn't it fire here? It's a trigger on DML; it shouldn't matter where the DML is invoked from, because at the database level the trigger should fire.

Fast forward to three days after opening a Premier Support ticket that had to be escalated three times to the R&D group at Salesforce: they informed me that it's a bug in the system, and they are working to resolve it in a future release.

So as a workaround I had to build some batch and scheduled Apex to do the processing. The client was OK with this solution, as the data can be up to a day stale. But very frustrating! Salesforce is usually pretty good with releases and bug fixes, so hopefully this will be addressed in the near future.

So my advice is to avoid writing Attachment triggers in Salesforce until the April release (hopefully) includes the fix.

Monday, December 6, 2010

Salesforce Schema Manager Singleton Pattern

I've been fortunate enough to work on some pretty cool Salesforce projects. One requirement that comes up a lot is building generic utilities that can run over any set of SObjects: for example, displaying all the available fields of an object in a Visualforce page, or doing some introspection on objects.

As of Winter '11, the SObject describe limit has been increased from 10 to 100, which is nice. However, this may still not be enough if you're making duplicate describe calls for the same object. If you use a singleton pattern to retrieve the information only once per session, you can avoid duplicate describe calls, saving you precious describe limits! Below is a simple example where I cache the SObject global describe in a Map, so that I can quickly access any SObject type based on the API name of the object.

---------------------------------------------------------------

//This class centralizes the methods that retrieve schema information on SObjects for Apex.
//It ensures duplicate describe calls aren't made, so we don't hit governor limits.
public with sharing class SchemaManager
{
    private static Map<String, Schema.SObjectType> sobjectSchemaMap;

    //Retrieve the specific Schema.SObjectType for an object so we can inspect it
    public static Schema.SObjectType getObjectSchema(String objectAPIName)
    {
        if(sobjectSchemaMap == null)
        {
            sobjectSchemaMap = Schema.getGlobalDescribe();
        }
        return sobjectSchemaMap.get(objectAPIName);
    }
}
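Usage is then a single call wherever you need it, for example:

Schema.SObjectType acctType = SchemaManager.getObjectSchema('Account');
System.debug(acctType.getDescribe().getLabel());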


I have used this pattern many times and have expanded on it to cache field metadata describes as well. I didn't post the larger SchemaManager here, but you get the idea.
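For the curious, the field-describe extension can follow the same shape. A sketch of one way to add it to the class above (my approximation, not the larger class mentioned):

//Cache field describes per object so repeated calls don't consume describe limits
private static Map<String, Map<String, Schema.SObjectField>> fieldMapCache = new Map<String, Map<String, Schema.SObjectField>>();

public static Map<String, Schema.SObjectField> getFieldSchema(String objectAPIName)
{
    if(!fieldMapCache.containsKey(objectAPIName))
    {
        Schema.SObjectType objType = getObjectSchema(objectAPIName);
        if(objType == null)
        {
            return null;
        }
        fieldMapCache.put(objectAPIName, objType.getDescribe().fields.getMap());
    }
    return fieldMapCache.get(objectAPIName);
}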