Stat Tracker

Friday, August 26, 2016

Winter 17 Release - Governor Limit Increases You Should Know

Governor Limit Increases in SFDC Winter 17 Release

These updates require no action and are automatically granted in the release. Load more records, relate more objects in formulas, and generally increase the scale of your solutions without doing anything!

1. Process Twice as Many Records with Bulk API

You know what’s better than being able to upload 5,000 batches a day with Bulk API? If you guessed, “being able to upload 10,000 batches a day,” you’re right! The daily batch limit has been increased to 10,000 for all orgs.

2. Create More Spanning Relationships Per Object in Formulas

Sometimes you want more! We’ve increased the number of unique relationships per object from 10 to 15. This increase is available in both Lightning Experience and Salesforce Classic.

3. Daily Org Limits for Sending Emails with the API Have Increased

Using the Salesforce API or Apex, you can now send single emails to 5,000 external email addresses per day based on Greenwich Mean Time (GMT). You can also send mass email to 5,000 external email addresses per day per org. The maximum number of external email addresses is no longer based on your Salesforce edition. You can use your remaining daily balance of external email addresses in as many mass emails as you’d like, regardless of your edition. This feature is available in both Lightning Experience and Salesforce Classic.

4. Get More Days to Schedule Your Quick Deployments

The time window to quick-deploy your validations has expanded from 4 days to 10 days. This larger time window provides you more flexibility for scheduling your quick deployment and helps minimize the impact on your org.

5. Higher Limits for Standard Picklists

Standard, multi-select picklists can be as detailed as you need them to be with a new limit of 255 characters per entry. This feature is available in both Lightning Experience and Salesforce Classic.

6. Make More API Calls and Get Fewer Headaches When Calculating API Limits

We simplified the API request limit calculation and gave everyone more calls per 24-hour period. For Enterprise Edition, Unlimited Edition, Performance Edition, and Professional Edition with API access enabled, the old calculation was based on your number of licenses and the license types, with a guaranteed minimum of 15,000 calls per 24-hour period. We scrapped the minimum and instead gave everyone 15,000 extra calls on top of the license-based calculation. The calculation for Developer Edition orgs and sandboxes remains the same.
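To make the change concrete, here is a minimal sketch of the before-and-after arithmetic. The per-license call rate below is an illustrative assumption, not an official Salesforce figure:

```python
# Sketch of the Winter '17 API request limit change.
# The per-license call rate is an illustrative assumption,
# not an official Salesforce figure.

CALLS_PER_LICENSE = 1_000  # hypothetical rate for one license type

def old_daily_limit(licenses: int) -> int:
    # Old rule: licenses x rate, with a guaranteed minimum of 15,000.
    return max(15_000, licenses * CALLS_PER_LICENSE)

def new_daily_limit(licenses: int) -> int:
    # New rule: the 15,000 is added on top instead of being a floor.
    return licenses * CALLS_PER_LICENSE + 15_000

print(old_daily_limit(5), new_daily_limit(5))      # 15000 20000
print(old_daily_limit(100), new_daily_limit(100))  # 100000 115000
```

Small orgs that used to sit at the 15,000 floor gain the most proportionally, but every org gets 15,000 more calls than the license-based figure alone.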

Tuesday, May 24, 2016

Passing the Salesforce Certified Integration Architecture Exam

I recently passed the Salesforce Certified Integration Architecture Exam. The full exam details can be found at the SFDC Architect Academy. This is one of the 9 Architect Domain Specialist exams. In this post I'll be providing my general thoughts on the content of the exam.

Scenario Based Questions

The exam asks multiple scenario-based questions where you need to choose the right integration pattern given a set of criteria. For example, a question may look something like this:

Delivery Express has an org that holds all of its customer information, including Accounts and Opportunities. Delivery Express also has an on-premises ERP system that handles order fulfillment. The ERP system has a SOAP API for creating and updating orders and is occasionally offline for maintenance. Delivery Express has a new requirement: whenever an Opportunity is marked Closed-Won, an order should be created in the ERP. The order needs to be created in near real time, and custom development work is not an option.

Which technology should the Integration Architect select based on the above?

A. Outbound Messaging
B. Apex Callout
C. ETL Tool
D. Canvas App

The above question can be answered with A, because Outbound Messaging can be set up without custom development, retries failed messages automatically, and delivers to SOAP endpoints.

Know Your Integration Options


You need to know when to use a mashup. If the requirement is simply to view external data in the SFDC UI, then you can look at a Canvas app or a Custom Web Tab. Use Canvas if you need authentication, and a Custom Web Tab if you need to surface an external website that is unauthenticated.

Middleware, ESB & ETL

You need to know when to use ETL and middleware to solve your problem. Do you need complex error handling and orchestration across multiple systems? Then you're going to need an ESB, middleware, or ETL tool. Do you have a data synchronization that does not need to be real time but must handle large data volumes? An ETL tool using the Bulk API is going to be your best bet.
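For the large-data-volume case, the core pre-processing step an ETL tool performs is splitting the extract into batches the Bulk API will accept. A minimal sketch, assuming the documented maximum of 10,000 records per batch:

```python
# Sketch: split a large extract into Bulk API-sized batches.
# 10,000 records per batch is the Bulk API's documented maximum.
BULK_API_MAX_BATCH = 10_000

def split_into_batches(records, batch_size=BULK_API_MAX_BATCH):
    """Yield successive fixed-size slices of the record list."""
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]

rows = [{"Id": i} for i in range(25_000)]
batches = list(split_into_batches(rows))
print(len(batches), len(batches[-1]))  # 3 5000
```

Each batch then becomes one request against the job, which is also why batch counts and daily batch limits matter when you size a synchronization.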

Apex Web Services

You need to know when it makes sense to incur the technical debt of building out an Apex Web Service. Do you need to perform multiple DML statements on the SFDC tier as an atomic operation? Then you need to look at using an Apex Web Service to handle the transaction as a single API call.

Apex Callouts

How do Apex Callouts work? You need to know that a synchronous Apex callout cannot fire inside an Apex trigger. The trigger has to invoke an @future(callout=true) method that in turn makes the HTTP callout.

Bulk API 

You should know when and how to use the Bulk API. How can you monitor a Bulk API job, through the UI and through API calls? What is the difference between serial and parallel modes? Which APIs support serial and parallel processing?

Avoid Record Lock Scenarios and Performance Issues

The answer is not always to do the work on the SFDC tier. Many times you want to pre-process your data before you load it into SFDC. One example is to group your data by a parent record, like Account, so that Bulk API jobs do not fail due to parent record locking. Similarly, if you are integrating large volumes of data, you do not want to upsert; you want to do separate insert and update calls, which forces you to pre-process the IDs in your data set.
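As a hedged sketch of the grouping idea (the object and field names are illustrative), pre-sorting child records under their parent Account before batching looks something like this, so that no two concurrent batches touch children of the same Account:

```python
from collections import defaultdict

# Sketch: group child records by parent Account ID before batching,
# so concurrent Bulk API batches don't update children of the same
# Account (a common cause of record-lock failures).
# Object and field names are illustrative.

def group_by_parent(contacts):
    groups = defaultdict(list)
    for c in contacts:
        groups[c["AccountId"]].append(c)
    return groups

def build_batches(contacts, batch_size=3):
    """Pack whole parent groups into batches, never splitting a
    parent's children across two batches where avoidable."""
    batches, current = [], []
    for parent, children in group_by_parent(contacts).items():
        if current and len(current) + len(children) > batch_size:
            batches.append(current)
            current = []
        current.extend(children)
    if current:
        batches.append(current)
    return batches

contacts = [
    {"LastName": "A", "AccountId": "001A"},
    {"LastName": "B", "AccountId": "001B"},
    {"LastName": "C", "AccountId": "001A"},
    {"LastName": "D", "AccountId": "001B"},
]
batches = build_batches(contacts)
for batch in batches:
    print([c["AccountId"] for c in batch])
```

With the data above, each resulting batch contains only one Account's children, so parallel batches never contend for the same parent row lock.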

Want to learn even more? Join my team!

I am currently looking for the next set of Engineers to mentor. If this type of technical content and challenge seems appealing to you, and you want to become an all-star Engineer across the entire PaaS stack, reach out to me on Twitter or LinkedIn. I am looking to fill several USA-based engineering roles on my engineering team at Fusion Risk Management (sorry, I cannot provide visa sponsorship).

Friday, May 13, 2016

Passing the Salesforce Certified Data Architecture and Management Exam

I recently passed the Salesforce Certified Data Architecture and Management Exam. This architect exam is one of 9 new Domain Specialist exams that constitute the SFDC Architect Academy certifications. A study guide with more details on the exam, including the number of questions and passing score, is available from the Architect Academy. In this post I am going to talk about the general questions, experience, and content that was on the exam.

Know Large Data Volumes (LDV)

Large Data Volumes refer to environments that have millions to hundreds of millions of records. The underlying multi-tenant architecture of the platform solves lots of problems and creates some interesting new ones. LDV environments pose technical challenges that other orgs don't often face, including but not limited to:

Data Skew
Sharing Calculation Time
Upsert Performance
Report Timeouts
Non-Selective Queries / Indexing Considerations / PK Chunking Mechanisms

There are many questions on the exam that cover LDV scenarios. 

How do you build Data Models that support LDV (Skinny Tables, Lookups, etc.)? Do you normalize or denormalize your objects, and why?

How do you load millions of records successfully (use the Bulk API)?

What settings may you need to turn off during a data load to successfully load data (Workflows, Validation Rules, deferred Sharing Calculations, etc.)?

How will you need to structure the data operations (perform separate Insert and Update calls rather than the simpler but more time-consuming Upsert)?

How do you group the data in batches (Group by Parent Lookup for individual batches to avoid record row lock issues)?
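One of the pre-processing steps above, splitting an Upsert into separate Insert and Update calls, amounts to a simple partition against the keys already in the org. A minimal sketch (the external key field name is illustrative):

```python
# Sketch: pre-process an incoming data set into separate insert and
# update lists, instead of one upsert, by checking each external key
# against the keys already present in Salesforce.
# The field name External_Key__c is illustrative.

def partition_for_load(incoming, existing_keys):
    inserts, updates = [], []
    for rec in incoming:
        if rec["External_Key__c"] in existing_keys:
            updates.append(rec)
        else:
            inserts.append(rec)
    return inserts, updates

existing = {"K-1", "K-3"}  # keys already present in the org
incoming = [{"External_Key__c": k} for k in ("K-1", "K-2", "K-3", "K-4")]
inserts, updates = partition_for_load(incoming, existing)
print(len(inserts), len(updates))  # 2 2
```

The extra extract of existing keys is the cost you pay up front; the payoff is that each load call does exactly one kind of DML, which performs better at scale than letting Upsert decide row by row.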

My advice is to read the following material to get familiar with LDV:

LDV Best Practices

Optimizing SOQL, List Views, and Reports

Bulk API Data Loading and Parallel Throughput

Primary Key Chunking
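Primary key chunking is easier to remember once you see the shape of what it does: one large query is broken into many small ID-range queries that can run independently. A rough illustration with simplified integer IDs (real Salesforce IDs are 15/18-character base-62 strings, but the principle is the same):

```python
# Rough illustration of the idea behind PK chunking: split one big
# query into many small ID-range queries. Plain integers stand in
# for real base-62 Salesforce IDs to keep the sketch simple.

def pk_chunk_filters(min_id, max_id, chunk_size):
    filters = []
    lo = min_id
    while lo <= max_id:
        hi = min(lo + chunk_size - 1, max_id)
        filters.append(f"WHERE Id >= {lo} AND Id <= {hi}")
        lo = hi + 1
    return filters

filters = pk_chunk_filters(1, 250_000, 100_000)
for f in filters:
    print(f)
```

Because each range query touches a bounded slice of the table, it stays selective even when the full table has hundreds of millions of rows.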

Know Master Data Management

Several of the questions revolve around understanding data management. This includes Data Quality best practices (using Duplicate Matching Rules, Validation Rules, and Reports and Dashboards for Data Quality KPIs) as well as Data Governance Programs and Systems of Record. There are several questions about maintaining data integrity between SFDC and three other systems. How do you maintain a System of Record across those systems?

My advice is to understand the mechanisms for Data Quality on the Platform including:
Duplicate Management capabilities
Lead Conversion
Merge Accounts

There are also questions about setting up a Data Governance Program. What roles do you need for a Governance Program (Data Stewards, etc.)? Think through the architecture considerations and how you would technically run such a program as the data architect.

Know Data Loading Options

This is part integration and part data management. You need to know what options are available to integrate data across systems: ETL tools, Data Loader, the SFDC Data Import Wizard, Outbound Messages, and SOAP and REST API considerations. For example, one question asked about considerations for loading millions of records into the system on a frequent basis (you need to think about API limits!). Another question asked about using Outbound Messaging to do an outbound, read-only synchronization of Opportunities to an ERP system. You should be familiar with all the integration options with respect to data-layer integrations.
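A back-of-the-envelope check like the following is the kind of thinking the API-limits question is after. All of the numbers here are illustrative assumptions, not limits for any particular org:

```python
import math

# Sketch: estimate whether a recurring bulk load fits inside the
# org's daily API request limit. All numbers are illustrative.

def calls_needed(records_per_load, loads_per_day, batch_size=10_000):
    # Each batch of records costs (at least) one API request.
    calls_per_load = math.ceil(records_per_load / batch_size)
    return calls_per_load * loads_per_day

daily_limit = 115_000  # hypothetical daily API request limit
calls = calls_needed(2_000_000, loads_per_day=24)
print(calls, calls <= daily_limit)  # 4800 True
```

The point is not the exact arithmetic but the habit: before committing to a load frequency, count the requests it will consume and leave headroom for everything else in the org that shares the same daily limit.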


This exam validates that an architect candidate has implemented, or is familiar with, large, complex Salesforce orgs. I hope the above material gives folks insight into this new exam.

If you want to learn more, connect with me on Twitter. I am looking for the next set of great Engineers to work with; if you are in Chicago especially, network with me. I am looking to hire talent that I can mentor and grow our Fusion Engineering Team with!