Stat Tracker

Saturday, March 22, 2014

Instrument Your Apex Code: DIY Performance Metrics on Force.com

Performance Monitoring on Force.com

There are many situations when developing on Force.com where you will need to gather performance metrics on your codebase. You may want to log how long an HTTP callout in Apex takes, or how long an Apex trigger runs with complex sharing rules. Gathering these metrics is especially important on the Force.com platform because of governor limits and the impact of long-running Apex transactions. Unfortunately, the native debug logs on SFDC do not let you capture more than 20 transactions per monitored user at a time, and these logs are not easily reportable.

This can present a clear challenge to developers but it can be easily remedied. In this blog post I will show you how you can roll your own Apex performance monitoring framework. You will be able to log performance on your Apex code, easily toggle the monitoring on or off without a deployment of code, and run reports and dashboards on your metric data. All this natively on the Force.com platform! No additional tools or vendor products need apply!

Step 1: Create a Custom Object For Performance Logs

Our first step will be to create a custom object to hold our performance logs. This object will have only a few fields. 

Name - An Autonumber for the transaction in the format of PERF-{00000001}
Apex Class - A string holding the name of the Apex Class for the transaction
Apex Method - A string holding the name of the Apex Method for the transaction
Time in Milliseconds - A number holding the transaction time in milliseconds
Time in Seconds - A formula field based on Time in Milliseconds for showing time in seconds (see the formula sketch just after this list)
Notes - Any developer notes you want to add for details about the transaction (Record Ids, etc)
Created By - The user who was executing the code
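
For reference, the Time in Seconds field is just a division of the milliseconds field. A minimal formula sketch, assuming the milliseconds field's API name is Time_in_Milliseconds__c (the name used by the Apex class later in this post):

Time_in_Milliseconds__c / 1000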

The completed object looks like this. Note: Make sure to "Enable Reports" so we can write native reports!


Bonus: Create a Custom Object Tab and expose the object in your UI if you want.

Step 2: Create a Custom Setting to Toggle Logs On/Off

Now that we have our custom object, we need to create a Custom Setting that will allow us to toggle logging on or off without changing any code. Custom Settings were built specifically for application-wide configuration like this, so it is a perfect use case. We are going to create a Custom Setting called "Global Setting" with Setting Type "List". 

We are only going to create one custom field called "Log Transactions". 
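
If you want to seed or flip this setting from code rather than from Setup (for example from Execute Anonymous), a one-line sketch like this works. It assumes the API names Global_Setting__c and Log_Transactions__c used by the framework class below, plus a list record named 'Global_Setting':

//Creates the list Custom Setting record the framework reads; set Log_Transactions__c to false to turn logging off.
insert new Global_Setting__c(Name = 'Global_Setting', Log_Transactions__c = true);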

Once we are done your Custom Setting will look like this:

Step 3: Create our Apex Monitoring Framework Class

Now that our data model is complete, we can build out our Apex framework class for monitoring transactions. The class itself is very simple and really only does three things: 
1. It automatically logs the transaction time by calculating the elapsed time in milliseconds.
2. It reads the Custom Setting to see if logging is enabled.
3. It inserts a log record into the System Performance Log object via a DML insert.

Here is the complete Apex Class:

public without sharing class SystemPerformanceDAO
{
  public static void insertPerformanceLog(PerformanceTransaction logRec)
  {
    //Stamp the end time for the transaction.
    logRec.endTimeMilliseconds = DateTime.now().getTime();

    //Only write a log record when the "Log Transactions" Custom Setting is enabled.
    Global_Setting__c globalSetting = Global_Setting__c.getInstance('Global_Setting');
    if(globalSetting != null && globalSetting.Log_Transactions__c == true)
    {
      insert new System_Performance_Log__c(
        Apex_Class__c = logRec.apexClassName,
        Apex_Method__c = logRec.apexMethodName,
        Notes__c = logRec.notes,
        Time_in_Milliseconds__c = (logRec.endTimeMilliseconds - logRec.startTimeMilliseconds));
    }
  }

  //Class used to maintain the state of the transaction for use by Apex classes.
  public class PerformanceTransaction
  {
    public long startTimeMilliseconds {get;set;}
    public long endTimeMilliseconds {get;set;}
    public String apexClassName {get;set;}
    public String apexMethodName {get;set;}
    public String notes {get;set;}

    public PerformanceTransaction(String apexClass, String apexMethod, String inNotes)
    {
      apexClassName = apexClass;
      apexMethodName = apexMethod;
      notes = inNotes;
      //Stamp the start time as soon as the transaction object is created.
      startTimeMilliseconds = DateTime.now().getTime();
      endTimeMilliseconds = 0;
    }
  }
}

Step 4: Instrument our Apex Classes with the Framework

Now that we have our data model and Apex framework class, we can start to instrument our Apex classes and methods. To do this we only need to add two lines of code to each method we want to instrument. 

The first line of our Apex methods should invoke the constructor in our framework for a transaction log like this:

SystemPerformanceDAO.PerformanceTransaction performanceLog = new SystemPerformanceDAO.PerformanceTransaction('AccountController','refreshContacts','Account Used: ' + acct.Id);

And the last line of our method should simply invoke our Apex class to log the metric like this:

SystemPerformanceDAO.insertPerformanceLog(performanceLog); 

This makes it very easy to add instrumentation to our Apex classes with minimal code. And we never have to change the code to turn logging on or off, because the framework checks the custom setting for us.

A complete example may look something like this:

public with sharing class AccountController
{
    public List<ContactListWrapper> acctContacts {get;set;}
    public Account acct {get;set;}

    public AccountController(ApexPages.StandardController std)
    {
        acct = (Account) std.getRecord();
        acctContacts = new List<ContactListWrapper>();
        List<Contact> contacts = [Select Id, Email, FirstName, LastName from Contact where AccountId =: acct.Id];
        for(Contact c : contacts)
        {
            acctContacts.add(new ContactListWrapper(c));
        }
    }


    public ApexPages.PageReference refreshContacts()
    {
        SystemPerformanceDAO.PerformanceTransaction performanceLog = new SystemPerformanceDAO.PerformanceTransaction('AccountController','refreshContacts','Account Used: ' + acct.Id);
        acctContacts.clear(); //Reset the list so a refresh does not duplicate contacts.
        List<Contact> contacts = [Select Id, Email, FirstName, LastName from Contact where AccountId =: acct.Id];
        for(Contact c : contacts)
        {
            acctContacts.add(new ContactListWrapper(c));
        }
        SystemPerformanceDAO.insertPerformanceLog(performanceLog);       
        return null;
    }
    
    public ApexPages.PageReference saveContactUpdates()
    {
        SystemPerformanceDAO.PerformanceTransaction performanceLog = new SystemPerformanceDAO.PerformanceTransaction('AccountController','saveContactUpdates','Account Used: ' + acct.Id);
        List<Contact> contactUpdates = new List<Contact>();
        for(ContactListWrapper clw : acctContacts)
        {
            if(clw.isSelected)
            {
                contactUpdates.add(clw.acctContact);
            }
        }
        if(contactUpdates.size() > 0)
        {
            upsert contactUpdates;
        }
        SystemPerformanceDAO.insertPerformanceLog(performanceLog);
        return null;
    }

    public class ContactListWrapper
    {
        public boolean isSelected {get;set;}
        public Contact acctContact {get;set;}
        
        public ContactListWrapper(Contact c)
        {
            acctContact = c;
            isSelected = false;
        }
    }
}

Step 5: Report on Your Performance Metrics

Now that we have our classes instrumented, we can use the native Force.com Analytics to build our reports. For this example I created a report to show the average, maximum, and minimum transaction times for each of my controller methods.
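
If you would rather pull the same numbers programmatically instead of through a report, a hedged SOQL sketch like the one below (run from Execute Anonymous) groups the logs by class and method using the fields we defined in Step 1:

//Summarize average, maximum, and minimum transaction times per instrumented method.
for(AggregateResult ar : [SELECT Apex_Class__c, Apex_Method__c,
                                 AVG(Time_in_Milliseconds__c) avgMs,
                                 MAX(Time_in_Milliseconds__c) maxMs,
                                 MIN(Time_in_Milliseconds__c) minMs
                          FROM System_Performance_Log__c
                          GROUP BY Apex_Class__c, Apex_Method__c])
{
    System.debug(String.valueOf(ar.get('Apex_Class__c')) + '.' + String.valueOf(ar.get('Apex_Method__c'))
        + ': avg ' + String.valueOf(ar.get('avgMs')) + ' ms'
        + ', max ' + String.valueOf(ar.get('maxMs')) + ' ms'
        + ', min ' + String.valueOf(ar.get('minMs')) + ' ms');
}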



Conclusions and Caveats

As you can see in this example, it is fairly simple to roll your own instrumentation on Force.com for performance metrics. One caveat is that this approach consumes data storage on the platform, so you will need to monitor your log data usage and periodically purge old records, especially if you don't have a lot of storage to spare.
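
A hedged sketch of that purge, run from Execute Anonymous or wrapped in a scheduled batch job, might look like this (the 30-day window and the 10,000 record cap are just illustrative):

//Delete performance logs older than 30 days, staying under the 10,000 row DML limit per transaction.
delete [SELECT Id FROM System_Performance_Log__c WHERE CreatedDate < LAST_N_DAYS:30 LIMIT 10000];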

The other caveat is that this will not work for instrumenting Visualforce controller constructors, because DML statements are not allowed while a constructor is executing.
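
If you do want a rough number for a constructor, one workaround is to build the PerformanceTransaction in the constructor and defer the insert to an action method fired from the page's action attribute. Here is a minimal sketch of that idea on a hypothetical controller (keep in mind that because insertPerformanceLog stamps the end time when it finally runs, the measurement will include page load work after the constructor as well):

public with sharing class AccountViewController
{
    private SystemPerformanceDAO.PerformanceTransaction constructorLog;

    public AccountViewController()
    {
        //Start timing here; DML is not allowed yet, so only build the transaction object.
        constructorLog = new SystemPerformanceDAO.PerformanceTransaction('AccountViewController', 'constructor', '');
        //... constructor work ...
    }

    //Wire this to <apex:page action="{!logConstructorTime}"> so the DML runs after the constructor completes.
    public PageReference logConstructorTime()
    {
        SystemPerformanceDAO.insertPerformanceLog(constructorLog);
        return null;
    }
}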

I hope you found this helpful. 

Sunday, January 19, 2014

Using Formula Fields to Translate Picklist Values into System Codes

When implementing multiple integrations with a Salesforce.com org, it is a common requirement to translate picklist values into system code values for legacy systems. For example, if you have a Quote object you may have a picklist called "Quote Payment Type" with valid values of "Cash", "Credit", or "Debit". These are the values displayed in the Salesforce.com UI and persisted in the Quote_Payment_Type__c field on your Quote object.

However, sometimes we need to translate the picklist values into system codes for an integration. If we are integrating our Quote object with a payment system via a web service, for example, the legacy system may expect a system code instead of the displayed string values. In this case let's say our legacy system is an AS400 mainframe that expects short character codes for the values (Cash == C, Credit == CR, Debit == D). How should we do this?

We could do this in Apex code and translate these values prior to making an HTTP callout. However, that assumes we will only integrate with this system via an HTTP callout in Apex code. We wouldn't be translating this for Data Loader integrations, or other 3rd party tool integrations that may simply pull the SObject and pass it to the legacy system. Also, if we build this translation in the Apex code layer, every time we add a new translation or change an existing translation we would need to do a code deployment and update our unit test coverage. Ouch!

A much cleaner way to implement this is to use a Formula Field on the object.

In this case we create a new Formula Field called "Quote Type INT".

Formula Field Definition:

IF(ISPICKVAL(Quote_Payment_Type__c, 'Cash'), 'C', 
IF(ISPICKVAL(Quote_Payment_Type__c, 'Credit'), 'CR', 
IF(ISPICKVAL(Quote_Payment_Type__c, 'Debit'), 'D', 'NA')))

And now the translation is always done in the declarative layer, which is SFDC best practice. Here is what we get in the UI if you choose to display the system code value:



It's a simple pattern that every developer should have in their toolbox. Always remember to use "Clicks not Code" whenever possible to solve your development or integration needs. With this pattern, if we add new translations we can simply update the Formula Field and picklist values without doing a full code deployment!

One thing to consider: if you have hundreds of values you may exceed the Formula Field size limit. In that case you may need to use a Custom Setting or another mechanism (Static Resources, etc.) to hold the translation values. But please don't hard code them!
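
For example, here is a hedged sketch of the Custom Setting approach: a hypothetical List Custom Setting called Quote_Type_Code__c, where each record's Name is the picklist value and a Code__c text field holds the system code, read from Apex like this:

public with sharing class QuoteTypeTranslator
{
    //Looks up the system code for a picklist value from the hypothetical Quote_Type_Code__c Custom Setting.
    public static String translate(String picklistValue)
    {
        Quote_Type_Code__c mapping = Quote_Type_Code__c.getInstance(picklistValue);
        return (mapping != null) ? mapping.Code__c : 'NA';
    }
}

New or changed translations then become Custom Setting data changes instead of code deployments, though unlike the formula this lookup only helps integrations that pass through Apex.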





Monday, December 16, 2013

Capturing Signatures with HTML5 Canvas in Salesforce 1 Mobile

Recently Salesforce.com released Salesforce 1, their latest mobile application. Salesforce 1 introduces a number of new features that enable developers to create mobile applications. One person I spoke with recently at Dreamforce wanted to know how they could capture signatures in the mobile application. With HTML5 Canvas, Visualforce, and a little JavaScript, you can easily roll your own lightweight signature capture functionality in Salesforce 1.

Here is a brief demonstration video:


I have included the entire source code below, but first a few notes about the tricky parts.

1. You will need to set up JavaScript event listeners on the canvas for touchstart, touchmove, and touchend. These handlers fire as you touch and drag your finger across the canvas.

2. You will need to use JavaScript Remoting to pass the canvas content into your Apex controller so that it can be saved. The canvas can be converted into a Base64 string with the canvas.toDataURL() method. That is how you get the bytes from the canvas into an Attachment in Salesforce.com.

These are illustrated in the sample code.

And here is the source code for the VF and Apex. If you put these into a Visualforce Tab and enable it for Salesforce 1 Mobile, then you can easily reuse this sample code.

Source Code:

Visualforce Page Code:

<apex:page controller="AnyObjectSignatureController" showheader="false" sidebar="false" standardStylesheets="false">
<script>var $j = jQuery.noConflict();</script>
<apex:stylesheet value="{!URLFOR($Resource.jquerymobile,'/jquerymobile/jquery.mobile-1.3.2.min.css')}"/>
<apex:includeScript value="{!URLFOR($Resource.jquery)}"  />
<apex:includeScript value="{!URLFOR($Resource.jquerymobile,'/jquerymobile/jquery.mobile-1.3.2.min.js')}"/>

<div data-role="page" id="signatureCaptureHome"> 
<div data-role="content">
<input id="accountNameId" type="text" name="accountName"/>
<input type="button" name="findAccountBtn" onclick="findAccounts();" value="Find Accounts"/>
<h1 id="recordSigId">Record Signature:</h1>
<canvas id="signatureCanvas" height="200" width="300"></canvas>
<input id="saveSigButton" type="button" name="SigCap" onclick="saveSignature();" value="Capture Signature"></input>
</div> 
</div> 

<script>

    var canvas;
    var context;
    var drawingUtil;
    var isDrawing = false;
    var accountId = '';

function DrawingUtil() 
{
    isDrawing = false;
    canvas.addEventListener("touchstart",start,false);
    canvas.addEventListener("touchmove",draw,false);
    canvas.addEventListener("touchend",stop,false);
    context.strokeStyle = "#FFF";  
}

//Start event for signature capture on the HTML5 canvas
function start(event) 
{
    isDrawing = true;
    canvas = document.getElementById("signatureCanvas");
    context = canvas.getContext("2d");    
    context.strokeStyle = "rgba(155,0,0,0.5)";      
    context.beginPath();
     context.moveTo(event.touches[0].pageX - canvas.getBoundingClientRect().left,event.touches[0].pageY - canvas.getBoundingClientRect().top);
}

//Event fired while someone is drawing, to capture the path as they draw
function draw(event) {
    event.preventDefault();
    if(isDrawing) {     
        context.lineTo(event.touches[0].pageX - canvas.getBoundingClientRect().left,event.touches[0].pageY - canvas.getBoundingClientRect().top);
        context.stroke();
    }
}


//Event when someone stops drawing their signature line
function stop(event) {
    if(isDrawing) {
        context.stroke();
        context.closePath();
        isDrawing = false;
    }
}

canvas = document.getElementById("signatureCanvas");
context = canvas.getContext("2d");
drawingUtil = new DrawingUtil(canvas);

function saveSignature()
{
    var strDataURI = canvas.toDataURL();
    //Strip the data URI prefix so only the Base64 image bytes are sent to the Apex controller.
    strDataURI = strDataURI.replace(/^data:image\/(png|jpg);base64,/, "");
    AnyObjectSignatureController.saveSignature(strDataURI, accountId, processResult);
}

function processResult(result)
{
alert(JSON.stringify(result));
}

function findAccounts()
{
    var nameValue = document.getElementById("accountNameId").value;
    AnyObjectSignatureController.findAccounts(nameValue, processSearchResult);
}

//Callback for the findAccounts remote action: store the Account Id and show its name above the canvas.
function processSearchResult(result)
{
    $j = jQuery.noConflict();
    $j.each(result, function(i, record) {
        accountId = record.Id;
        $j("#recordSigId").html("Record Signature: " + record.Name);
    });
    $j("#recordSigId").trigger("update");
}

</script>

</apex:page>

Apex Controller:
global with sharing class AnyObjectSignatureController 
{
    public AnyObjectSignatureController()
    {
    }

    //Remote action used by the page to search for Accounts by name.
    @RemoteAction
    global static List<Account> findAccounts(String name)
    {
        name = '%' + name + '%';
        List<Account> accounts = [Select Id, Name from Account where Name like :name];
        return accounts;
    }

    //Remote action that decodes the Base64 canvas content and stores it as a PNG Attachment on the parent record.
    @RemoteAction
    global static String saveSignature(String signatureBody, String parentId) 
    {
        try
        {
            system.debug('Record Id == ' + parentId);
            system.debug(signatureBody);
            Attachment a = new Attachment();
            a.ParentId = parentId;
            a.Body = EncodingUtil.base64Decode(signatureBody);
            a.ContentType = 'image/png';
            a.Name = 'Signature Capture.png';
            insert a;
            return '{success:true, attachId:' + a.Id + '}';
        }
        catch(Exception e)
        {
            return JSON.serialize(e);
        }
        return null;
    }
}

Sunday, October 13, 2013

Dreamforce 2013 Sessions - Let's Rock

Dreamforce is upon us! In just a few weeks San Francisco will turn into the mecca of cloud computing, with almost 100,000 cloud devotees making the annual pilgrimage. This will be my third year presenting at Dreamforce, and I can honestly say the Developer Zone has gotten better and better each year. This year I will be presenting or contributing to four different sessions. And for the first time I'll be co-presenting a session with another person! I'm excited to be working with Tim McDonald on our administrator and developer session.

Come check out the sessions, contribute to the conversations on the chatter feeds, and get your brain ready for data downloads!

Case Study: Building a Mobile App for Field Services

Wednesday, November 20th: 4:00 PM - 4:30 PM
Moscone Center West, Mobile Theater

Description

The Salesforce Platform allows you to architect complete solutions for entire lines of business, whether it's desktop or mobile users. Join us as we focus on how users can build a fully featured mobile solution for Field Service engineers. 

By dissecting an HTML5 Hybrid Application for the Service Cloud, you'll get exposure to building a complete mobile application using jQuery Mobile for the UI, Salesforce Mobile SDK for Security & REST API Access, the NFC PhoneGap Plugin for Serial Number Scanning and Automatic Case Assignment, HTML5 Canvas for Signature Capture, Camera Access for Case Documentation & Attachment on the Case in Salesforce, and the Chatter API for Social Feeds on Cases.
Speakers:
Cory Cowgill, The Warranty Group

Apex Trigger Debugging: Solving the Hard Problems

Wednesday, November 20th: 11:45 AM - 12:30 PM
Moscone Center West, 2020
Full

Description

Apex Triggers can be your best friend or your worst enemy. When a trigger is firing properly your data is under control and remains sane, but when a trigger doesn't fire properly, your users can be faced with the frustration of exceptions when saving a record, or worse: incorrect data. Join us to learn tips and tricks on how to debug and solve the most complex issues, including: Ambiguous Field Validation, After Insert Activity Errors, and SOQL and Governor Limit Errors. You'll learn the origins of these kinds of advanced trigger issues and gain solutions for avoiding them.
Speakers:
Cory Cowgill

Clicks AND Code: A Dreamforce Session for Administrators AND Developers

Wednesday, November 20th: 9:00 AM - 9:50 AM
The Westin St. Francis San Francisco, California West
Full

Description

Administrators seem to have adopted the mantra of “Clicks not Code.” However, more often than not, the customization of the Salesforce Platform through the use of code provided by a developer is not only necessary, but required for a successful implementation. Join us as we present best practices for administrators to use when engaging their developer counterparts, while providing some tips and tricks for developers to quickly respond to the requests placed before them.
Speakers:
Cory Cowgill, West Monroe Partners
Tim McDonald, New Tangram, LLC

Force.com Careers: How Do I Get There From Here?

Thursday, November 21st: 11:00 AM - 11:45 AM
Moscone Center West, 2020

Description

Do you love developing on the Salesforce Platform, but wonder what the next steps are for your career? Join our panelists to hear about various career paths, including Consultant, Architect, Product Manager, and AppExchange Developer, to name a few. These experts will share the pros and cons of their careers and also the path to get there.
Speakers:
Carolina Ruiz Medina, FinancialForce.com
Cory Cowgill, West Monroe Partners
Leah McGowen-Hare, salesforce.com
Cheryl Porro, salesforce.com
Ayori Selassie, salesforce.com
Andy Ognenoff, Cloud Sherpas
Kevin O'Hara, LevelEleven


Can't get into one of my sessions because it's full? Don't worry: presentations and source code will be distributed to the general public after the sessions. Have a question for me? Hit me up in the Developer Zone during the conference; I usually camp out by the hackathon, theaters, or coffee station. You can also reach me on Dreamforce Chatter or on Twitter @corycowgill, and I'm happy to discuss anything Force.com related.

Looking forward to another awesome year!



Wednesday, July 24, 2013

Force.com Data Model - Enumeration Tables versus Picklists

The Salesforce Platform allows customers to build robust, relational data models to suit any need. In fact, with tools like Schema Builder it is so simple to get started building that it can be a bit of a double-edged sword. The simplicity allows functions that once rested solely in the hands of a Database administrator to be performed by a Business Analyst, or even the End Users. However, with great power always comes great responsibility.

The number one problem I have encountered working with clients who have performed Salesforce.com self-implementations is data-model related. There are several common mistakes self-implementers should avoid. In this blog post I'll be discussing how heavy data normalization can work against you on the platform.

This problem often occurs when the implementation was run by an internal IT team with traditional SQL skill sets. They will create a custom object for every single enumeration table they think is needed, without regard to how SFDC relational data models actually work (picklists, for example).
Heavily Normalized Data Model in MySQL

This leads to headaches when building standard reports, and usability issues when viewing and editing data with standard SFDC pages. For the above example, the multi-select picklist for "Payment Options" would show up as a Related List, and the Marketing Status would show up as a Lookup. If we created this same data model in SFDC it would look like this:
Erroneously built Data Model in SFDC - Heavily Normalized Data Model in SFDC

And this would manifest itself on the Standard UI as this:

Illustration: Ugly UI

The Related List at the bottom "allows" for the multi-select picklist, and the lookup field in the detail section stands in for the Marketing Status. This is very nasty for end users! Imagine if you had dozens of multi-select picklists! You would have dozens of related lists! And the users would have to click multiple times to enter a payment option, and do a lookup each time for the marketing status.

Not to mention they can't easily filter on Payment Options for List Views and Reports.

This can easily be corrected by using picklist and multi-select picklists in SFDC. If we use those types of fields our data model removes those 3 objects and everything resides on the Company object. The correct data model looks like this:
Whoa! Only 1 object! 


And it manifests itself in the UI like this:
Much Cleaner!

This is much cleaner in the UI, allows easy reporting and filtering, and saves us 3 objects we don't have to build on the back end.
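
As a bonus, the flattened model keeps querying simple too. A hedged SOQL sketch, assuming the API names Company__c and Payment_Options__c (multi-select picklist) and a 'Cash' picklist value, shows how easily you can filter once everything lives on one object:

//Find companies that accept cash as one of their payment options.
List<Company__c> cashCompanies = [SELECT Id, Name
                                  FROM Company__c
                                  WHERE Payment_Options__c INCLUDES ('Cash')];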

In future posts I'll discuss the inverse problem where data is too heavily de-normalized on the platform.

In short, the key to building successful data models on the platform is to delicately balance the need for custom objects ("tables") and features of the Salesforce platform (picklists, multi-select picklists, record types, etc).




Sunday, June 16, 2013

Inserting PDF Attachments into Salesforce.com using Talend and iText

I recently presented at the local Chicago Force.com Developer Group on using Talend to move data into and out of Salesforce.com. One question I fielded was about moving documents into Salesforce.com. Yes, you can use Talend to do this! For this demonstration I am going to use a few components and show you how to get the classpath set up properly for the ETL job.

Use Case:


For this demonstration, we are going to dynamically generate PDF content inside our ETL job. We are going to extract all of our Account records from SFDC and generate PDF content from the fields on each Account record. We will then insert that content into an Attachment record for each Account as a PDF document created using iText.

Step 1 - Create Your Salesforce.com Metadata Connection

The first step is to create your connection for SFDC metadata. I have a video on YouTube which shows you how you can do this. For this example you will want to pull in the Account and Attachment objects from SFDC. After following the instructions in the video for creating your SFDC metadata connections, you should have them in your repository like the figure below.



Step 2 - Download iText PDF Library

You will need the iText PDF library for the portions of the ETL job which generate dynamic PDF content. You will also need to add the iText jar file to your Java build path. You can do this by using the User Libraries feature as shown below:

Talend -> Preferences -> Java -> Build Path -> User Libraries


Step 3 - Create a new Job Design and Load Java Libraries

Talend allows you to load external Java libraries into your job so that their code can be executed inside the job. The component to load external Java libraries is called tLibraryLoad. You should use this component as one of the first steps in your job to load any dependent jars you need. In this case, we are going to load the iText PDF library, as well as Apache Commons Codec so that we can Base64 encode our Attachment file content (more on that later).

Using tLibraryLoad component to load Java jar files.



Step 4 - Query SFDC Account Records

In this step we simply use the tSalesforceInput component to read Accounts. For details, see the video above on how to query the Account records.


Step 5 - Create the Account Record PDF using tJavaRow

The tJavaRow component is a very powerful component. It allows you to code functionality into your ETL job using Java. In this example we are using the iText PDF Library to generate a simple PDF. The thing to remember about tJavaRow is that the code will execute for each record in the input step.

When you add and connect your tJavaRow component to the tSalesforceInput component, the first thing you need to do is click on the Advanced Settings tab and add the imports for any libraries that you need. These classes should reside in the jars you loaded in the previous step with the tLibraryLoad component.

Import Libraries for iText and Apache Commons Codec


Once you have the imports set up as above, you can then go to the Basic Settings tab. This will present you with the tJavaRow code editor where you can add your code. The first thing I do is click the "Sync Columns" and "Generate Code" buttons. This will automatically create the code to simply move all the data from the input row into the output row for this component.

Auto Code Generation and Column Sync


After I have let Talend do the heavy lifting of generating my getter and setter code, I then add two new fields: one called "FileName" and one called "Content_Body". These fields will hold the filename in the format "Account Name.pdf" and the actual file content as a Base64 encoded string.

Add Fields to the Mapping for the Content and File Name



The Salesforce APIs use Base64 strings to encode and pass file content. That is the reason we need Apache Commons Codec in our job: to convert the PDF bytes into a format that SFDC can ingest. Using the output and input variables in the code, we can generate a PDF using the simple iText PDF objects. Here is the complete code; all of it goes inside the tJavaRow component.


//Code generated according to input schema and output schema
output_row.Id = input_row.Id;
output_row.IsDeleted = input_row.IsDeleted;
output_row.MasterRecordId = input_row.MasterRecordId;
output_row.Name = input_row.Name;
output_row.Type = input_row.Type;
output_row.ParentId = input_row.ParentId;
output_row.BillingStreet = input_row.BillingStreet;
output_row.BillingCity = input_row.BillingCity;
output_row.BillingState = input_row.BillingState;
output_row.BillingPostalCode = input_row.BillingPostalCode;
output_row.BillingCountry = input_row.BillingCountry;
output_row.ShippingStreet = input_row.ShippingStreet;
output_row.ShippingCity = input_row.ShippingCity;
output_row.ShippingState = input_row.ShippingState;
output_row.ShippingPostalCode = input_row.ShippingPostalCode;
output_row.ShippingCountry = input_row.ShippingCountry;
output_row.Phone = input_row.Phone;
output_row.Fax = input_row.Fax;
output_row.AccountNumber = input_row.AccountNumber;
output_row.Website = input_row.Website;
output_row.Sic = input_row.Sic;
output_row.Industry = input_row.Industry;
output_row.AnnualRevenue = input_row.AnnualRevenue;
output_row.NumberOfEmployees = input_row.NumberOfEmployees;
output_row.Ownership = input_row.Ownership;
output_row.TickerSymbol = input_row.TickerSymbol;
output_row.Description = input_row.Description;
output_row.Rating = input_row.Rating;
output_row.Site = input_row.Site;
output_row.OwnerId = input_row.OwnerId;
output_row.CreatedDate = input_row.CreatedDate;
output_row.CreatedById = input_row.CreatedById;
output_row.LastModifiedDate = input_row.LastModifiedDate;
output_row.LastModifiedById = input_row.LastModifiedById;
output_row.SystemModstamp = input_row.SystemModstamp;
output_row.LastActivityDate = input_row.LastActivityDate;
output_row.CustomerPriority__c = input_row.CustomerPriority__c;
output_row.SLA__c = input_row.SLA__c;
output_row.Active__c = input_row.Active__c;
output_row.NumberofLocations__c = input_row.NumberofLocations__c;
output_row.UpsellOpportunity__c = input_row.UpsellOpportunity__c;
output_row.SLASerialNumber__c = input_row.SLASerialNumber__c;
output_row.SLAExpirationDate__c = input_row.SLAExpirationDate__c;
output_row.Location__Latitude__s = input_row.Location__Latitude__s;
output_row.Location__Longitude__s = input_row.Location__Longitude__s;

//CREATE PDF CODE - FROM ITEXT PDF EXAMPLE
        //Build a simple PDF in memory for this Account record.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        Document document = new Document();
        PdfWriter.getInstance(document, bos);
        document.open();
        document.add(new Paragraph("Account Name: " + output_row.Name + "\nAccountID: " + output_row.Id));
        document.close();

        //Base64 encode the PDF bytes so the Salesforce API can accept them as the Attachment body.
        byte[] bytes = bos.toByteArray();
        String stringToStore = Base64.encodeBase64String(bytes);
        output_row.Content_Body = stringToStore;

        output_row.FileName = input_row.Name + ".pdf";



Step 6 - Insert the PDF into the Attachment Record 



Finally we pass along the values using a tMap component and the tSalesforceOutput component for Attachments. We map the output row fields from the tJavaRow onto our tSalesforceOutput row.

tMap Component


The complete demo job looks like this:



Step 7 - Export the Job and Run


The last step is exporting the job and running it on your machine. If you export the job design as a zip, you can get to the AccountPDFJob.sh file which executes the job. This video shows it running, and you can see the final generated PDF attachment in SFDC.


I hope this helps folks. There are a lot of uses for this. You could generate PDF documents out of SFDC for all sorts of things with this approach. You could build a job that merges your SFDC data with an ERP and create a PDF invoice for example. Or you could use the Chatter objects and post files to a persons Chatter feed. There are lots of scenarios.