Stat Tracker

Saturday, May 30, 2015

Computer Science 101 in Apex - Bubble Sort

Bubble Sort in Apex

The bubble sort algorithm is one of the first sorting algorithms that computer science students learn. It can be thought of as the "Hello World" of sorting algorithms: the code is very easy to implement, and the algorithm itself is fairly straightforward. The trade-off is that for large data sets it has among the worst performance of the common sorting algorithms.

In a bubble sort, an array of values is iterated over one element at a time. The current element is compared to the next element in the collection, and if the current element is greater, the two values' positions are swapped. The algorithm iterates over the collection multiple times until all the elements in the array are ordered. In each pass the highest unsorted value moves to its final position in the array, hence the term "bubble sort": the value has "bubbled" its way to the correct position.

If you want more details on the bubble sort algorithm, take a look at the Wikipedia entry here: http://en.wikipedia.org/wiki/Bubble_sort

Now to the fun stuff. The following code will perform a bubble sort in Apex, which runs on the Salesforce.com PaaS.
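The embedded code from the original post is not reproduced here. As a stand-in, here is a minimal sketch of the algorithm described above, written in Java (whose syntax closely mirrors Apex):

```java
import java.util.Arrays;

public class BubbleSort {
    // Repeatedly sweep the array, swapping adjacent out-of-order pairs,
    // until a full pass completes with no swaps.
    public static void bubbleSort(int[] values) {
        boolean swapped = true;
        while (swapped) {
            swapped = false;
            for (int i = 0; i < values.length - 1; i++) {
                if (values[i] > values[i + 1]) {
                    int temp = values[i];      // swap the adjacent pair
                    values[i] = values[i + 1];
                    values[i + 1] = temp;
                    swapped = true;
                }
            }
        }
    }

    public static void main(String[] args) {
        int[] data = {5, 1, 4, 2, 8};
        bubbleSort(data);
        System.out.println(Arrays.toString(data)); // [1, 2, 4, 5, 8]
    }
}
```

The `swapped` flag lets the loop exit early once a pass makes no swaps, which is the usual small optimization over the naive nested-loop version.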



Here are some logs showing it working as expected.



How useful is this in a real-world application of the Salesforce platform? Not very. It is much easier to use the optimized sorting built into the Apex collection classes themselves (List.sort(), for example). Or, if you are working with a SOQL query, simply adding ORDER BY to the query will sort for you.
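For comparison, the library route is a one-liner (shown in Java here; Apex's List.sort() behaves the same way for primitives):

```java
import java.util.Arrays;

public class LibrarySortDemo {
    public static void main(String[] args) {
        int[] values = {5, 1, 4, 2, 8};
        Arrays.sort(values); // an optimized library sort, not a bubble sort
        System.out.println(Arrays.toString(values)); // [1, 2, 4, 5, 8]
    }
}
```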

However, it is a fun little piece of code and illustrates that you can teach core computer science principles on the Salesforce platform. Enjoy!

Monday, December 29, 2014

Create a Real Life Dashboard with CloudBit and Salesforce.com

This is the second post where I am working with the CloudBit and Salesforce.com. In my first article I made a very simple LED light up when an Opportunity in Salesforce.com moved to a stage of "Closed Won". The second time around I decided to get a little more creative. And what is cooler than recreating your very own Salesforce.com Dashboards in real life with some cardboard, markers, and some littleBits!


littleBits Components

1. USB Power Module
2. CloudBit Module
3. Servo Module

Salesforce.com Components

1. Salesforce.com Report & Dashboards
2. Salesforce.com Analytics API
3. Apex Class to invoke API and littleBits Cloud API


Step 1 - Create a Salesforce.com Summary Report & Dashboard

For this example I created a Salesforce.com Summary Report. The report summarizes the total amount (value) of Opportunities that have a Stage of "Closed Won".


Once you create the report you can create the dashboard with the report as the data source. Create a gauge chart in your dashboard and set the minimum amount to 2 million and the maximum amount to 5 million. The gauge chart maps well to the Servo littleBit. 

Step 2 - Map the littleBit Servo Arc to the Gauge Value Range

The littleBit Servo has two basic modes: Step and Turn. In Turn mode you can move the Servo to a specific degree based on the voltage you pass to it. You can set the voltage from 0 to 100 percent. The littleBit Servo has an arc of 145 degrees, so you will lose some precision since you can only set integer voltage values. 

To map a dollar value to a voltage percentage: the gauge spans 3 million dollars (5 million max - 2 million min), so each of the servo's 145 degrees represents about 20,690 dollars (3 million / 145 degrees). Each 1% of voltage moves the servo 1.45 degrees (145 degrees / 100%). Combining the two, 1% of voltage corresponds to 30,000 dollars (3 million / 100).
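To sanity-check the arithmetic, here is a sketch of the dollar-amount-to-voltage-percent mapping (in Java for illustration; the clamp to the 0-100 range is my addition, not from the original post):

```java
public class GaugeMapping {
    static final double GAUGE_MIN = 2_000_000; // 0% voltage
    static final double GAUGE_MAX = 5_000_000; // 100% voltage

    // Map a summed Opportunity amount onto the 0-100 voltage percent scale.
    static int toVoltagePercent(double amount) {
        double percent = (amount - GAUGE_MIN) / (GAUGE_MAX - GAUGE_MIN) * 100;
        // Clamp so out-of-range report values can't produce an invalid voltage.
        return (int) Math.round(Math.max(0, Math.min(100, percent)));
    }

    public static void main(String[] args) {
        System.out.println(toVoltagePercent(3_500_000)); // halfway up the gauge -> 50
    }
}
```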


Step 3 - Call the Salesforce Analytics API to get the Gauge Value

Call the Salesforce.com Analytics API to get the values from the Opportunity Report. You can use the following Apex Class which will call the Salesforce.com API and convert the values into the appropriate voltage for the littleBit API.

public class LittleBitsReportManager 
{
    public static LittleBitsManager.LittleBitsOutputPayload runOpptyGaugeReport()
    {
        // Get the report ID
        List<Report> reportList = [SELECT Id, DeveloperName FROM Report WHERE 
            DeveloperName = 'Opportunities_Won'];
        String reportId = reportList[0].Id;
        
        // Run the report synchronously
        Reports.ReportResults results = Reports.ReportManager.runReport(reportId, true);
        System.debug('Synchronous results: ' + results);
        
        // Get the first down-grouping in the report, in this case the StageName = 'Closed Won' on Oppty's
        Reports.Dimension dim = results.getGroupingsDown();
        Reports.GroupingValue groupingVal = dim.getGroupings()[0];
        
        // Construct a fact map key, using the grouping key value
        String factMapKey = groupingVal.getKey() + '!T';
        
        // Get the fact map from the report results
        Reports.ReportFactWithDetails factDetails =
            (Reports.ReportFactWithDetails)results.getFactMap().get(factMapKey);
        
        // Get the first summary amount from the fact map
        Reports.SummaryValue sumVal = factDetails.getAggregates()[0];
        System.debug('Summary Label: ' + sumVal.getLabel());
        System.debug('Summary Value: ' + sumVal.getValue());
        
        //Set the output voltage as a percent. More voltage moves the servo farther along the gauge.
        //Gauge chart: 0% voltage = 2 million (minimum amount on gauge)
        //Gauge chart: 100% voltage = 5 million (maximum amount on gauge)
        //30,000 dollars = 1 voltage percent (3 million between min and max / 100 percent)
        //2 million is the bottom of the gauge, so subtract it before scaling.
        Integer outputPercent = Math.round((((Decimal) sumVal.getValue()) - 2000000) / 30000);
        Integer duration = -1; //A duration of -1 sets the voltage permanently until it is overwritten
        
        LittleBitsManager.LittleBitsOutputPayload outputPayload = new LittleBitsManager.LittleBitsOutputPayload(outputPercent,duration);
        System.debug(JSON.serialize(outputPayload));
        return outputPayload;
    }
}


Step 4 - Invoke the littleBits CloudBit API

Using the same Apex Class "LittleBitsManager" I created in the previous post you can call the CloudBit to set the Servo to match the Dashboard Gauge in Salesforce.com.

This simple line of code will reuse our code from the previous article and use the new code above to call the CloudBit.

LittleBitsManager.sendOutputReq(LittleBitsReportManager.runOpptyGaugeReport());

Executing this line of code will make the "Real Life" Dashboard match the Dashboard in Salesforce.com. Let's see it in action!



You can take this to the next step and make this run via Scheduled Apex every 5 minutes to create a dashboard in your office! Put additional reports and charts together with more CloudBits and littleBit components to create multiple chart dashboards!


Saturday, December 27, 2014

Calling littleBits CloudBit from Salesforce.com

littleBits Overview


Recently I received a littleBits CloudBit starter pack as a holiday gift. For those of you who don't know what littleBits are, you can check them out here: http://littlebits.cc/  I like to think of them as LEGOs for IoT. I have played with Arduino and a few other maker packs, but they all have a pretty steep learning curve. For example, to get my Arduino setup to talk to the cloud I had to set up a Node.js server on my local Mac, program an Arduino Sketch, and mess around with a breadboard, breaking multiple transistors along the way. I have shaky hands! I like Arduino, but I wanted something that was a little easier to play with. Enter littleBits.

Whereas LEGOs have interlocking blocks, littleBits have interlocking components held together by tiny magnets. There are several types of littleBits that you snap together to form electronic circuits (power, inputs, outputs, and wires). Below is the very simple circuit I built for this demo. It has only three modules: the USB Power, the CloudBit, and an LED light.




This makes it super simple to create IoT devices. The cloud module also automatically syncs to the littleBits cloud and has REST API access, as well as IFTTT connectors. I wanted more granular control over my integration, so I chose to call the CloudBit API directly via REST instead of using the available IFTTT connectors.

So the first thing I wanted to do was set this up to work with the Salesforce1 Platform.

Pre-Requisites

1. Salesforce1 Developer Login (Free, Sign Up Here)
2. littleBits CloudKit ($100 US, Here)
3. Setup your littleBits Cloud Account via the CloudBit instructions.

Salesforce Setup

Once you have set up your littleBits cloud account and completed the first tutorial that shows you how to configure it, you are ready to move on to calling your device from other cloud platforms via the littleBits Cloud HTTP API. From Salesforce.com we will be using Apex HTTP Callouts to send JSON payloads to our littleBits CloudBit device.

Step 1: Setup the Remote Site

In Salesforce.com Setup, navigate to "Remote Site Settings" and create a Remote Site Setting with the remote site URL: https://api-http.littlebitscloud.cc
Once the Remote Site Setting is configured we can work on creating our Apex Class.

Step 2: Create the Apex Class "LittleBitsManager".

Using the Developer Console or the Force.com IDE, create the following simple Apex Class. Replace the x's in the URL with your Device ID, and the x's in the Authorization header with the Authorization Token from your littleBits cloud setup.


public class LittleBitsManager 
{
    //If we want to make a callout to littleBits from a Trigger, we have to do it asynchronously.
    @future(callout=true) //callout=true allows an HTTP Callout from Apex in an async context
    public static void sendOutputReqAsync(Integer percent, Integer duration)
    {
        LittleBitsManager.sendOutputReq(new LittleBitsOutputPayload(percent, duration));
    }
    
    //This method will call out to the littleBits Cloud device
    public static void sendOutputReq(LittleBitsOutputPayload outputPayload)
    {
        HttpRequest req = new HttpRequest(); 

        //Set HTTPRequest method
        req.setMethod('POST');
        
        //Set HTTPRequest header properties; littleBits takes JSON payloads and is RESTful
        req.setHeader('Content-Type', 'application/json');
        
        //Set the endpoint (you should make this a Custom Setting, but it is hard coded for the example)
        req.setEndpoint('https://api-http.littlebitscloud.cc/v2/devices/xxxxxxxxxxxx/output');
        
        req.setHeader('Authorization', 'Bearer xxxxxxxxxxxxxxxx');
        
        //Set the HTTPRequest body
        //req.setBody('{"percent":100, "duration_ms":3000}'); 
        req.setBody(JSON.serialize(outputPayload));
        Http http = new Http();
        
        try {
            //Execute the web service call here
            HttpResponse res = http.send(req);
            
            //Helpful debug messages
            System.debug(res.toString());
            System.debug('STATUS:' + res.getStatus());
            System.debug('STATUS_CODE:' + res.getStatusCode());
        } catch(System.CalloutException e) {
            //Exception handling goes here....
        }
    }
    
    //Request payload in Apex to hold the values we are sending to littleBits.
    public class LittleBitsOutputPayload
    {
        //Percent is the amount of output voltage you want to send, from 0 to 100, as an integer
        public Integer percent {get;set;}
        //Duration is how long to hold the voltage. 3000 milliseconds is the default per the API docs.
        public Integer duration_ms {get;set;}
        
        public LittleBitsOutputPayload(Integer inputPercent, Integer inputDuration)
        {
            this.percent = inputPercent;
            this.duration_ms = inputDuration;
        }
    }
}


Step 3 - Invoke the LittleBitsManager Apex Class from an Apex Trigger

This simple Apex Trigger will invoke the littleBits Cloud asynchronously via the Apex code above. Our simple use case: when someone closes an opportunity in Salesforce, we light up an LED.

trigger OpportunityTrigger on Opportunity (after update) 
{
    //Always put this logic in a Trigger Handler, never in trigger code.
    //For a littleBits example like this it is OK.
    for(Opportunity oppty : trigger.new)
    {
        //If an Opportunity has been moved to a stage of Closed Won, let the team know! Light up the board via littleBits.
        if(oppty.StageName == 'Closed Won' && trigger.oldMap.get(oppty.Id).StageName != 'Closed Won')
        {
            LittleBitsManager.sendOutputReqAsync(100, 10000); //100% power, 10 seconds (10,000 milliseconds)
        }
    }
}


Step 4 - Test it Out

Here is a simple demonstration showing everything running together. When the Apex Trigger fires, the call is made to the CloudBit to light up for 10 seconds. This is a very simple use case, but it shows how simple it is to get the "plumbing" working between the two platforms. With hundreds of different littleBits components and the flexible Salesforce1 platform, you can rapidly build nearly any type of IoT application you can think of!



Note: Reid Carlberg from the Salesforce.com Evangelist team already did a very similar setup, using IFTTT and Twilio SMS to activate a CloudBit. If you want to see how to use IFTTT rather than coding directly against the CloudBit API, it's worth a view! (https://www.youtube.com/watch?v=Pa9QrwEhuSA)

Sunday, November 23, 2014

A Simple PDF Form Service with the Heroku Deploy Button

I recently took the sample PDF Forms Service I built for Dreamforce 2014 and have made it available via GitHub and the Heroku Deploy button.

The Heroku Deploy button is pretty slick. It will automatically provision an app on Heroku for you with a click of a button! So now you can not only look at my source code on GitHub, but also instantly spin up the Form Service Sample App on your own Heroku account, with just a click of a button!

Here is the Simple REST PDF Form Service Heroku Deploy Button! Go forth and build!

Deploy 

If you have questions on how the actual Form Service works, check out my session from Dreamforce 2014 here:




Friday, August 29, 2014

Apex Unit Tests - 4 Random Successful Implementation Tips

Any experienced Salesforce.com developer will tell you that Apex Unit Tests are critical to the success of any Salesforce.com implementation. This is especially true in complex Salesforce.com environments with multiple lines of business using the system. Imagine if you have 10 completely different business units using the same objects with different record types, workflow rules, visualforce pages, batch processes, triggers, etc.

Here are some random thoughts on how you can get the most out of your Unit Tests. These have helped me in the past.



  1. The build is not complete without Apex Unit Tests
  • Unit tests should be completed as functional code is delivered. A story or task should not be considered complete until the accompanying unit tests are built. Ask yourself "Can I deploy this functionality to production successfully? Does it have code coverage?" SFDC will stop you without sufficient code coverage.
  2. Apex Unit Tests without system asserts are worthless for support and maintenance
  • Unit tests need System.assert() calls to be of any practical value. If a unit test executes solely for the purpose of code coverage, then it is not doing its true job, which is to guarantee system behavior. A developer who changes the code may think everything is fine because your unit test did not fail, even though the change actually broke the runtime behavior. 
  3. Deployment between sandboxes must pass all unit tests
  • The Salesforce platform only requires unit test coverage for production deployments. If you bake test runs into the deployment process for every environment, you will ensure success in a large enterprise SDLC that has multiple environments.
  4. If you have complex workflows, validation rule logic, or Roll-Up Summary (RUS) fields, build some Apex Unit Tests to validate them even if there is no Apex code.
  • You can use unit tests to validate your "clicks not code" functionality if it is significantly complex. For example, if you have RUS fields that have complex filters and get used across several objects for logic, putting Apex Unit Tests in place can help ensure future changes to these items will not corrupt your data.
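To make tip 2 concrete, here is the difference in miniature (sketched in Java; the Apex equivalent would use System.assertEquals). The applyDiscount method is a hypothetical stand-in for whatever code is under test:

```java
public class DiscountTest {
    // Hypothetical method under test: a 10% discount on a price in cents.
    static int applyDiscount(int priceCents) {
        return priceCents * 9 / 10;
    }

    public static void main(String[] args) {
        int result = applyDiscount(1000);
        // A test that merely calls applyDiscount(1000) earns code coverage but
        // guarantees nothing. The assertion below is what catches a regression.
        if (result != 900) {
            throw new AssertionError("Expected 900 but got " + result);
        }
        System.out.println("ok");
    }
}
```

If someone later changes the discount logic, the coverage-only version keeps passing; the asserting version fails loudly, which is the whole point.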








Wednesday, June 18, 2014

Converting Base 10 to Base 32 (Or Any Base) in Apex

Recently I had the need to convert a base 10 number into a base 32 number in Salesforce. For my project I needed to integrate SFDC with a legacy AS400 database to store contract records.

The Contract Number Requirements

The legacy database had an existing contract number field with a length of 9, where the first 4 characters were a standard prefix and the remaining characters had to form a unique number.

For example, a contract number looked like "POLY48GH3", where POLY was the prefix and 48GH3 was the number. If you've ever worked with legacy AS400 databases you know they love to limit the amount of characters in a field! Furthermore, we couldn't use the characters W, X, Y, or Z in the field and could only use 0-9 and the uppercase letters A-V.

Fun requirements right!

The Solution

We quickly identified that if we used base 32 numbers we could get the most unique numbers out of the 5 available characters per our requirements (32^5 = 33554432 available numbers). In Java you can easily achieve this via the standard method Integer.toString(integerVal, radix), where you can specify your radix (base). There is no corresponding method in Apex!

You can use EncodingUtil in Apex to do Base64 encoding, but if you need to tailor your base you are out of luck. To do custom base conversions you need to write your own method, which I have provided below. You can tweak this method to create different base conversions as necessary. You could also probably write this in a Formula Field with some additional work if you didn't want to do it in Apex.

I hope this helps folks. A big shoutout to the computer science blog from Erik Oosteral which outlines this logic for those who are interested.

-------------------
public static String generateBase32ContractNumber(Decimal decimalValue, Integer contractNumberLength)
{
    try
    {
        //Base 32 per the requirements: digits 0-9 plus uppercase A-V (no W, X, Y, or Z)
        Long outputBase = 32;
        String numericBaseData = '0123456789ABCDEFGHIJKLMNOPQRSTUV';
        String outputValue = '';
        Long remaining = decimalValue.longValue();
        if(remaining == 0)
        {
            outputValue = '0';
        }
        //Peel off the least significant base 32 digit each pass and prepend it
        while(remaining > 0)
        {
            Integer digitIndex = (Integer) Math.mod(remaining, outputBase);
            outputValue = numericBaseData.substring(digitIndex, digitIndex + 1) + outputValue;
            remaining = remaining / outputBase;
        }
        //We want to ensure all characters have a value. So if the base 32 number is 10, and our contract number length is 5, we want to make the output String "00010"
        while(outputValue.length() < contractNumberLength)
        {
            outputValue = '0' + outputValue;
        }
        return outputValue;
    }
    catch(Exception e)
    {
        throw new Custom_Exception('There was an error converting the base values: ' + e.getMessage());
    }
}
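As a cross-check, the equivalent logic in Java can lean on the built-in radix support mentioned earlier (Java's base 32 digits are 0-9 then a-v, so an uppercase pass matches the AS400 character set):

```java
public class ContractNumbers {
    // Convert a value to base 32 (digits 0-9 then A-V) and left-pad with
    // zeros to the required contract number length.
    static String toBase32Contract(long value, int length) {
        StringBuilder sb = new StringBuilder(Long.toString(value, 32).toUpperCase());
        while (sb.length() < length) {
            sb.insert(0, '0');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // "POLY" prefix plus the 5-character base 32 counter from the post
        System.out.println("POLY" + toBase32Contract(32, 5));       // POLY00010
        System.out.println("POLY" + toBase32Contract(33554431, 5)); // POLYVVVVV (32^5 - 1, the largest value)
    }
}
```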

Saturday, March 22, 2014

Instrument Your Apex Code : DIY Performance Metrics on Force.com

Performance Monitoring on Force.com

There are many instances when you develop on Force.com where you will have to gather performance metrics on your codebase. You may want to log how long an HTTP Callout in Apex takes, or how long an Apex Trigger runs with complex sharing rules. Gathering these metrics is especially important on the Force.com platform due to Governor Limits and the impact of long-running Apex transactions. Unfortunately the native logs on SFDC do not allow you to log more than 20 transactions per user per transaction batch, and these logs are not easily reportable.

This can present a clear challenge to developers but it can be easily remedied. In this blog post I will show you how you can roll your own Apex performance monitoring framework. You will be able to log performance on your Apex code, easily toggle the monitoring on or off without a deployment of code, and run reports and dashboards on your metric data. All this natively on the Force.com platform! No additional tools or vendor products need apply!

Step 1: Create a Custom Object For Performance Logs

Our first step will be to create a custom object to hold our performance logs. This object will have only a few fields. 

Name - An Autonumber for the transaction in the format of PERF-{00000001}
Apex Class - A string holding the name of the Apex Class for the transaction
Apex Method - A string holding the name of the Apex Method for the transaction
Time in Milliseconds - A number holding the transaction time in milliseconds
Time in Seconds - A formula field based on Time in Milliseconds for showing time in seconds
Notes - Any developer notes you want to add for details about the transaction (Record Ids, etc)
Created By - The user who was executing the code

The total object looks like this. Note: Make sure to "Enable Reports" so we can write native reports!


Bonus: Create a Custom Object Tab and expose the object in your UI if you want.

Step 2: Create a Custom Setting to Toggle Logs On/Off

Now that we have our custom object, we need to create a Custom Setting that will allow us to toggle the logs on or off without changing any code. Custom Settings were built specifically for application-wide settings like this, making them the perfect fit here. We are going to create a Custom Setting called "Global Setting" with a Setting Type of "List". 

We are only going to create one custom field called "Log Transactions". 

Once we are done your Custom Setting will look like this:

Step 3: Create our Apex Monitoring Framework Class

Now that we have our data model complete, we can build our Apex framework class for monitoring transactions. The class itself is very simple and really only does 3 things: 
1. It automatically logs the transaction time by calculating the time in milliseconds.
2. It reads the Custom Setting to see if logging is set to true.
3. It will insert a log record via DML insert on the System Performance Log.

Here is the complete Apex Class:

public without sharing class SystemPerformanceDAO
{
 public static void insertPerformanceLog(PerformanceTransaction logRec)
  {
    logRec.endTimeMilliseconds = DateTime.Now().getTime();
    Global_Setting__c globalSetting = Global_Setting__c.getInstance('Global_Setting');
    if(globalSetting != null && globalSetting.Log_Transactions__c == true)
    { 
      insert new System_Performance_Log__c(Apex_Class__c = logRec.apexClassName, Apex_Method__c = logRec.apexMethodName, Notes__c = logRec.notes, Time_in_Milliseconds__c = (logRec.endTimeMilliseconds - logRec.startTimeMilliseconds));
    }
  }

  //Class used to maintain the State of the Transaction for use by Apex Classes.
  public class PerformanceTransaction
  {
    public long startTimeMilliseconds {get;set;}
    public long endTimeMilliseconds {get;set;}
    public String apexClassName {get;set;}
    public String apexMethodName {get;set;}
    public String notes {get;set;}
    
    public PerformanceTransaction(String apexClass, String apexMethod, String innotes)
    {
      apexClassName = apexClass;
      apexMethodName = apexMethod;
      notes = innotes;
      startTimeMilliseconds = DateTime.Now().getTime();
      endTimeMilliseconds = 0;
    }
  }
}

Step 4: Instrument our Apex Classes with the Framework

Now that we have our data model and Apex framework classes we can start to instrument our Apex classes and methods. To do this we only add two lines of code to the methods we want to instrument. 

The first line of our Apex methods should invoke the constructor in our framework for a transaction log like this:

SystemPerformanceDAO.PerformanceTransaction performanceLog = new SystemPerformanceDAO.PerformanceTransaction('AccountController','refreshContacts','Account Used: ' + acct.Id);

And the last line of our method should simply invoke our Apex class to log the metric like this:

SystemPerformanceDAO.insertPerformanceLog(performanceLog); 

This makes it very easy for us to add instrumentation to our Apex classes with minimal code. And we never have to change code to turn logging on or off, since the framework checks the custom setting for us at runtime.

A complete example may look something like this:

public with sharing class AccountController
{
    public List<ContactListWrapper> acctContacts {get;set;}
    public Account acct {get;set;}

    public AccountController(ApexPages.StandardController std)
    {
        acct = (Account) std.getRecord();
        acctContacts = new List<ContactListWrapper>();
        List<Contact> contacts = [Select Id, Email, FirstName, LastName from Contact where AccountId =: acct.Id];
        for(Contact c : contacts)
        {
            acctContacts.add(new ContactListWrapper(c));
        }
    }


    public ApexPages.PageReference refreshContacts()
    {
        SystemPerformanceDAO.PerformanceTransaction performanceLog = new SystemPerformanceDAO.PerformanceTransaction('AccountController','refreshContacts','Account Used: ' + acct.Id);
        List<Contact> contacts = [Select Id, Email, FirstName, LastName from Contact where AccountId =: acct.Id];
        for(Contact c : contacts)
        {
            acctContacts.add(new ContactListWrapper(c));
        }
        SystemPerformanceDAO.insertPerformanceLog(performanceLog);       
        return null;
    }
    
    public ApexPages.PageReference saveContactUpdates()
    {
        SystemPerformanceDAO.PerformanceTransaction performanceLog = new SystemPerformanceDAO.PerformanceTransaction('AccountController','saveContactUpdates','Account Used: ' + acct.Id);
        List<Contact> contactUpdates = new List<Contact>();
        for(ContactListWrapper clw : acctContacts)
        {
            if(clw.isSelected)
            {
                contactUpdates.add(clw.acctContact);
            }
        }
        if(contactUpdates.size() > 0)
        {
            upsert contactUpdates;
        }
        SystemPerformanceDAO.insertPerformanceLog(performanceLog);
        return null;
    }

    public class ContactListWrapper
    {
        public boolean isSelected {get;set;}
        public Contact acctContact {get;set;}
        
        public ContactListWrapper(Contact c)
        {
            acctContact = c;
            isSelected = false;
        }
    }
}

Step 5: Build Reports on Our Performance Logs

Now that we have our classes instrumented, we can use the native Force.com Analytics to build our reports. For this example I created a report to show the average, maximum, and minimum transaction times for each of my controller methods.



Conclusions and Caveats

As you can see in this example, it is fairly simple to roll your own instrumentation on Force.com for performance metrics. One caveat is that this approach consumes data storage on the platform, so you will need to monitor your log data usage and periodically purge the records, especially if you don't have a lot of spare storage. 

The other caveat is that this will not work for instrumenting Visualforce Apex controller constructor methods. Visualforce Apex controller constructors do not allow you to perform DML statements.

I hope you found this helpful.