Stat Tracker

Tuesday, May 24, 2016

Passing the Salesforce Certified Integration Architecture Exam

I recently passed the Salesforce Certified Integration Architecture Exam. The full exam details can be found at the SFDC Architect Academy. This is one of the 9 Architect Domain Specialist exams. In this post I'll be providing my general thoughts on the content of the exam.


Scenario Based Questions

The exam asks multiple scenario-based questions where you need to choose the right integration pattern given a set of criteria. For example, a question may look something like this:

Delivery Express has a Salesforce.com Org that holds all of the customer information, including Accounts and Opportunities. Delivery Express has an on-premises ERP system that serves as the order fulfillment system. The ERP system has a SOAP API for creating and updating orders and is occasionally offline for maintenance. Delivery Express has a new requirement: whenever an Opportunity is set to Closed Won, an order should be created in the ERP. The order creation needs to be near real time, and custom development work is not an option.

Which technology should the Integration Architect select based on the above?

A. Outbound Messaging
B. Apex Callout
C. ETL Tool
D. Canvas App

The above question can be answered with A because Outbound Messaging can be set up without custom development, retries failed messages automatically, and delivers to SOAP endpoints.
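To make the receiving side of the pattern concrete, here is a minimal Python sketch of an Outbound Messaging listener. Salesforce POSTs a SOAP notification to your endpoint and keeps retrying (for up to 24 hours) until it receives an acknowledgement of true. The `forward_to_erp` hook and the sample record Id are hypothetical; the namespaces follow the standard outbound message WSDL.

```python
# Minimal sketch of an Outbound Messaging listener (hypothetical ERP hook).
# Salesforce POSTs a SOAP notification; the listener must reply with
# <Ack>true</Ack>, or Salesforce retries the message for up to 24 hours.
import xml.etree.ElementTree as ET

ACK_RESPONSE = (
    '<?xml version="1.0" encoding="UTF-8"?>'
    '<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">'
    "<soapenv:Body>"
    '<notificationsResponse xmlns="http://soap.sforce.com/2005/09/outbound">'
    "<Ack>true</Ack>"
    "</notificationsResponse>"
    "</soapenv:Body>"
    "</soapenv:Envelope>"
)

def handle_notification(soap_body: str):
    """Extract the notified record Ids, hand them off, and acknowledge."""
    root = ET.fromstring(soap_body)
    # sObject fields in the notification use the enterprise sObject namespace
    ids = [el.text for el in root.iter("{urn:sobject.enterprise.soap.sforce.com}Id")]
    # forward_to_erp(ids)  # hypothetical hook into the order-fulfillment system
    return ids, ACK_RESPONSE
```

If the listener returns anything other than a positive Ack, the message stays in the outbound queue, which is what gives the pattern its automatic retry behavior.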

Know Your Integration Options

MashUps

You need to know when to use a mashup. If the requirement is simply to view external data in the SFDC UI, then you can look at a Canvas app or a Custom Web Tab. Use Canvas if you need authentication, and a Custom Web Tab if you need to surface an external website that is unauthenticated.

Middleware, ESB & ETL

You need to know when to use ETL and middleware to solve your problem. Do you need complex error handling and orchestration across multiple systems? Then you're going to need an ESB, middleware, or an ETL tool. Do you have a data synchronization that does not need to be real time but must handle large data volumes? An ETL tool using the Bulk API is going to be your best bet.

Apex Web Services

You need to know when it makes sense to incur the technical debt of building out an Apex Web Service. Do you need to execute multiple DML statements on the SFDC tier as an atomic operation? Then you need to look at using an Apex Web Service to handle the transaction as a single API call.

Apex Callouts

How do Apex Callouts work? You need to know that an Apex Callout cannot be made synchronously from inside an Apex Trigger. The trigger has to invoke a @future(callout=true) method that in turn performs the HTTP callout.

Bulk API 

You should know when and how to use the Bulk API. How can you monitor a Bulk API job? You can do it both through the UI and through API calls. What is the difference between Serial and Parallel modes? Which APIs support Serial and Parallel processing?
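As a refresher on the mechanics, a Bulk API job is created by POSTing a `jobInfo` request, and the same job resource can then be polled to monitor progress. The Python sketch below builds the request body with an explicit concurrency mode; the instance name, job Id, and API version are placeholders.

```python
# Sketch: creating and monitoring a Bulk API job.
# Serial mode processes batches one at a time (useful to avoid lock
# contention); Parallel (the default) maximizes throughput.

def bulk_job_request(operation: str, sobject: str, serial: bool = False) -> str:
    """Build the XML body POSTed to /services/async/<version>/job."""
    mode = "Serial" if serial else "Parallel"
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">'
        f"<operation>{operation}</operation>"
        f"<object>{sobject}</object>"
        f"<concurrencyMode>{mode}</concurrencyMode>"
        "<contentType>CSV</contentType>"
        "</jobInfo>"
    )

def job_status_url(instance: str, job_id: str, api_version: str = "36.0") -> str:
    """GET this URL (with the X-SFDC-Session header) to monitor the job."""
    return f"https://{instance}.salesforce.com/services/async/{api_version}/job/{job_id}"
```

The same job information is also visible in Setup under the Bulk Data Load Jobs monitoring page, which is the UI side of the monitoring question.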

Avoid Record Lock Scenarios and Performance Issues

The answer is not always to do the work on the SFDC tier. Many times you want to pre-process your data before you load it into SFDC. One example is to group your data by a parent record, like Account, so that Bulk API jobs do not fail due to parent record locking. Similarly, if you are integrating large volumes of data, you do not want to use upsert; you want to perform separate insert and update calls, which forces you to pre-process the IDs in your data set.
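Both pre-processing steps above are simple to sketch. Assuming a list of record dictionaries (field names here are illustrative), one helper groups children by parent so each batch touches distinct parent Accounts, and another splits the set into insert and update calls based on whether a Salesforce Id was pre-resolved:

```python
# Sketch: pre-process records before a Bulk API load.
# 1) Group children by parent lookup so batches don't contend for the
#    same parent record lock.
# 2) Split upsert work into explicit insert vs. update calls.
from collections import defaultdict

def group_by_parent(records, parent_field="AccountId"):
    """Put all children of the same parent into the same batch."""
    batches = defaultdict(list)
    for rec in records:
        batches[rec[parent_field]].append(rec)
    return list(batches.values())

def split_insert_update(records, id_field="Id"):
    """Records with a pre-resolved Salesforce Id go to update; the rest to insert."""
    inserts = [r for r in records if not r.get(id_field)]
    updates = [r for r in records if r.get(id_field)]
    return inserts, updates
```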

Want to learn even more? Join my team!

I am currently looking for the next set of Force.com Engineers to mentor. If this type of technical content and challenge seems appealing to you, and you want to become an all-star Force.com Engineer across the entire Force.com PaaS stack, reach out to me on Twitter or LinkedIn. I am looking to fill several USA-based engineering roles on my engineering team at Fusion Risk Management (sorry, I cannot provide visa sponsorship).



Friday, May 13, 2016

Passing the Salesforce Certified Data Architecture and Management Exam

I recently passed the Salesforce Certified Data Architecture and Management Exam. This architect exam is one of 9 new Domain Specialist exams that constitute the SFDC Architect Academy certifications. There is a study guide with more details on the exam, including the number of questions and passing score, at http://certification.salesforce.com/architects. In this post I am going to talk about the general questions, experience, and content that was on the exam.

Know Large Data Volumes (LDV)

Large Data Volumes refer to Salesforce.com environments that have millions to hundreds of millions of records. The underlying multi-tenant architecture of the Force.com platform solves lots of problems and creates some interesting ones. LDV environments pose technical challenges that other orgs don't often face, including but not limited to:

Data Skew
Sharing Calculation Time
Upsert Performance
Report Timeouts
Non-Selective Queries / Indexing Considerations / PK Chunking Mechanisms

There are many questions on the exam that cover LDV scenarios. 

How do you build Data Models that support LDV (Skinny Tables, Lookups, etc.)? Do you normalize or denormalize your objects, and why?

How do you load millions of records successfully (use the Bulk API)?

What settings may you need to turn off during a data load to successfully load data (Workflows, Validation Rules, Defer Sharing Calculations, etc.)?

How will you need to structure the data operations (perform separate Insert and Update calls rather than the simpler but more time-consuming Upsert)?

How do you group the data into batches (group by parent lookup so individual batches avoid record-lock issues)?

My advice is to read the following material to get familiar with LDV:

LDV Best Practices

Optimizing SOQL, List Views, and Reports

Bulk API Data Loading and Parallel Throughput

Primary Key Chunking:
https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/async_api_headers_enable_pk_chunking.htm
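Primary key chunking is enabled with a single request header on a Bulk API query job; Salesforce then splits the extract into Id ranges server-side. A minimal sketch of the headers (the session Id is a placeholder, and you should check the current limits on chunk size for your org):

```python
# Sketch: request headers for a Bulk API query job with PK chunking enabled.
# PK chunking makes Salesforce split a large extract into ranges of record
# Ids, so each chunk queries an indexed primary-key range instead of doing
# one enormous full scan.

def pk_chunking_headers(session_id: str, chunk_size: int = 100000) -> dict:
    return {
        "X-SFDC-Session": session_id,           # Bulk API auth header
        "Content-Type": "application/xml",
        "Sforce-Enable-PKChunking": f"chunkSize={chunk_size}",
    }
```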



Know Master Data Management

Several of the questions revolve around understanding data management. This includes Data Quality best practices (using Duplicate Matching Rules, Validation Rules, and Reports and Dashboards for Data Quality KPIs) as well as Data Governance Programs and Systems of Record. There are several questions about maintaining data integrity between SFDC and 3 other systems. How do you maintain a System of Record across those systems?

My advice is to understand the mechanisms for Data Quality on the Platform including:

Data.com
Duplicate Management Capability of Salesforce.com
Lead Conversion
Merge Accounts


There are also questions about setting up a Data Governance Program. What roles do you need for a Governance Program (Data Stewards, etc.)? Think through the architecture considerations and how you, as the data architect, would technically run such a program.

Know Data Loading Options

This is part integration and part data management. You need to know what options are available to integrate data across systems: ETL tools, Data Loader, the SFDC Data Import Wizard, Outbound Messages, and SOAP and REST API considerations. For example, one question asked about considerations for loading millions of records into the system on a frequent basis (you need to think about API limits!). Another question asked about using Outbound Messaging to do an outbound, read-only synchronization of Opportunities to an ERP system. You should be familiar with all the integration options with respect to data-layer integrations.
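The API-limits point comes down to simple arithmetic: how many Bulk API batches will a recurring load consume, and does that fit under the org's rolling 24-hour batch allowance? The per-batch record cap and daily batch limit below are assumptions for illustration; check your org's current Bulk API limits before relying on them.

```python
# Back-of-the-envelope check against API limits before a recurring load.
# records_per_batch and daily_batch_limit are assumed values; verify them
# against the current Bulk API limits for your org and API version.
import math

def batches_needed(record_count: int, records_per_batch: int = 10000) -> int:
    """How many Bulk API batches a load of this size consumes."""
    return math.ceil(record_count / records_per_batch)

def fits_daily_limit(record_count: int, daily_batch_limit: int = 5000,
                     records_per_batch: int = 10000) -> bool:
    """True if the load fits inside the rolling 24-hour batch allowance."""
    return batches_needed(record_count, records_per_batch) <= daily_batch_limit
```

For example, a 3-million-record load at 10,000 records per batch needs 300 batches, which leaves plenty of headroom under a 5,000-batch daily allowance, but several such loads per day from multiple systems can add up quickly.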


Conclusion

This exam validates that an architect candidate has implemented, or is at least familiar with, large, complex Salesforce orgs. I hope the above material gives folks insight into this new exam.

If you want to learn more, connect with me on Twitter. I am looking for the next set of great Force.com Engineers to work with. If you are in Chicago, especially, network with me. I am looking to hire talent that I can mentor and grow our Fusion Engineering Team with!