Stat Tracker

Tuesday, April 11, 2017

Salesforce Security Review - Security Posture & False Positive Template

When a company submits its application for Salesforce Security Review, it often needs to provide supporting documentation. There is a lot of information on the process and security scans here (https://developer.salesforce.com/page/Security_Review), but it doesn't include any templates or documentation standards a submission should follow.


Below is a sample of a template I have used, and found helpful, when submitting to Salesforce Security Review. I have gone through Salesforce Security Review more than 10 times for large OEM managed packages with hundreds of thousands of lines of code. I have found that the more information you can give the security engineers when they review your application, the more likely you are to pass your security review. Producing this documentation as part of your SDLC also bakes it into your engineering processes, helping ensure future reviews are successful.

Here is the sample template for a fictional application with some simple data points. For a true enterprise-class application this can be a rather lengthy but important document.
Security Review Considerations and False Positives Template

Want to learn more? Join my engineering organization! I'm looking for new talent to mentor in multiple engineering roles! http://www.fusionrm.com/careers

Friday, August 26, 2016

Winter 17 Release - Governor Limit Increases You Should Know

Governor Limit Increases in SFDC Winter 17 Release

These updates require no action and are automatically granted in the release. Load more records, relate more objects in formulas, and generally increase the scale of your solutions without doing anything! https://releasenotes.docs.salesforce.com/en-us/winter17/release-notes

1. Process Twice as Many Records with Bulk API

You know what’s better than being able to upload 5,000 batches a day with Bulk API? If you guessed, “being able to upload 10,000 batches a day,” you’re right! The daily batch limit has been increased to 10,000 for all orgs.

2. Create More Spanning Relationships Per Object in Formulas

Sometimes you want more! We’ve increased the number of unique relationships per object from 10 to 15. This increase is available in both Lightning Experience and Salesforce Classic.

3. Daily Org Limits for Sending Emails with the API Have Increased

Using the Salesforce API or Apex, you can now send single emails to 5,000 external email addresses per day based on Greenwich Mean Time (GMT). You can also send mass email to 5,000 external email addresses per day per org. The maximum number of external email addresses is no longer based on your Salesforce edition. You can use your remaining daily balance of external email addresses in as many mass emails as you’d like, regardless of your edition. This feature is available in both Lightning Experience and Salesforce Classic.
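
To illustrate the Apex side of this limit, here is a minimal sketch (the recipient address and wording are hypothetical); each external recipient in a send like this counts against the daily limit described above.

// Minimal sketch: send one email to an external address via Apex
Messaging.SingleEmailMessage mail = new Messaging.SingleEmailMessage();
mail.setToAddresses(new String[] { 'customer@example.com' }); // hypothetical external address
mail.setSubject('Order Confirmation');
mail.setPlainTextBody('Thank you for your order.');
Messaging.sendEmail(new Messaging.Email[] { mail });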

4. Get More Days to Schedule Your Quick Deployments

The time window to quick-deploy your validations has expanded from 4 days to 10 days. This larger time window provides you more flexibility for scheduling your quick deployment and helps minimize the impact on your org.

5. Higher Limits for Standard Picklists

Standard, multi-select picklists can be as detailed as you need them to be with a new limit of 255 characters per entry. This feature is available in both Lightning Experience and Salesforce Classic.

6. Make More API Calls and Get Fewer Headaches When Calculating API Limits

We simplified the API request limit calculation and gave everyone more calls per 24-hour period. For Enterprise Edition, Unlimited Edition, Performance Edition, and Professional Edition with API access enabled, the old calculation was based on your number of licenses and the license types, with a guaranteed minimum of 15,000 calls per 24-hour period. We scrapped the minimum and gave everyone 15,000 more calls. The calculation for Developer Edition orgs and sandboxes remains the same.
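
As a rough worked example (assuming, purely for illustration, an Enterprise Edition allocation of about 1,000 calls per Salesforce license): an org with 20 licenses previously got the greater of 20 x 1,000 = 20,000 calls or the 15,000 minimum, so 20,000 calls per 24 hours. Under the new calculation it simply gets 20 x 1,000 + 15,000 = 35,000 calls per 24 hours.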

Tuesday, May 24, 2016

Passing the Salesforce Certified Integration Architecture Exam

I recently passed the Salesforce Certified Integration Architecture exam, one of the nine Architect Domain Specialist exams. The full exam details and description can be found at the SFDC Architect Academy. In this post I'll share my general thoughts on the content of the exam.


Scenario Based Questions

The exam asks multiple scenario-based questions where you need to choose the right integration pattern given a set of criteria. For example, a question may look something like this:

Delivery Express has a Salesforce.com org that holds all of its customer information, including Accounts and Opportunities. Delivery Express has an on-premise ERP system that serves as the order fulfillment system. The ERP system has a SOAP API for creating and updating orders and is occasionally offline for maintenance. Delivery Express has a new requirement that whenever an Opportunity is Closed Won, an order should be created in the ERP. The order needs to be near real time, and custom development work is not an option.

Which technology should the Integration Architect select based on the above?

A. Outbound Messaging
B. Apex Callout
C. ETL Tool
D. Canvas App

The answer to the above question is A, because Outbound Messaging can be set up without custom development, retries failed messages automatically, and supports SOAP API endpoints.

Know Your Integration Options

MashUps

You need to know when to use a mashup. If the requirement is simply to view external data in the SFDC UI, then you can look at using a Canvas app or a Custom Web Tab. Use a Canvas app if you need authentication, and a Custom Web Tab if you need to surface an external website that is unauthenticated.

Middleware, ESB & ETL

You need to know when to use ETL and middleware to solve your problem. Do you need complex error handling and orchestration across multiple systems? Then you're going to need an ESB, middleware, or ETL tool. Do you have a data synchronization that does not need to be real time but must handle large data volumes? Again, an ETL tool using the Bulk API is going to be your best bet.

Apex Web Services

You need to know when it makes sense to incur the technical debt of building out an Apex Web Service. Do you need to perform multiple DML statements on the SFDC tier as an atomic operation? Then you need to look at using an Apex Web Service to handle the transaction as a single API call, as in the sketch below.
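
Here is a rough sketch of that pattern (the class name, URL mapping, and field values are hypothetical, not from the exam): a custom Apex REST service wraps several DML statements in one API call and rolls them all back together if any step fails.

@RestResource(urlMapping='/orders/create')
global with sharing class OrderService {

    @HttpPost
    global static Id createAccountAndOpportunity(String accountName) {
        // One savepoint so every DML in this call commits or rolls back as a unit
        Savepoint sp = Database.setSavepoint();
        try {
            Account acct = new Account(Name = accountName);
            insert acct;

            Opportunity opp = new Opportunity(
                Name = accountName + ' Order',
                AccountId = acct.Id,
                StageName = 'Closed Won',
                CloseDate = Date.today()
            );
            insert opp;

            return opp.Id;
        } catch (Exception e) {
            // Undo both inserts and surface the error to the caller
            Database.rollback(sp);
            throw e;
        }
    }
}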

Apex Callouts

How do Apex Callouts work? You need to know that an Apex Callout cannot fire directly inside an Apex Trigger. The trigger has to invoke a @future method that in turn makes the HTTP callouts.
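
Here is a minimal sketch of that pattern (the trigger name, class name, and ERP endpoint are hypothetical): the trigger collects record Ids and hands them to a @future(callout=true) method, which is where the HTTP callout actually happens. In a real org the trigger and class would live in separate files.

trigger OpportunityErpTrigger on Opportunity (after update) {
    Set<Id> closedWonIds = new Set<Id>();
    for (Opportunity opp : Trigger.new) {
        // For brevity this sketch does not compare against Trigger.oldMap
        if (opp.StageName == 'Closed Won') {
            closedWonIds.add(opp.Id);
        }
    }
    if (!closedWonIds.isEmpty()) {
        ErpOrderCalloutService.sendOrders(closedWonIds);
    }
}

public class ErpOrderCalloutService {
    // callout=true is required for an async @future method to make HTTP callouts
    @future(callout=true)
    public static void sendOrders(Set<Id> opportunityIds) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint('https://erp.example.com/api/orders'); // hypothetical ERP endpoint
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/json');
        req.setBody(JSON.serialize(opportunityIds));
        HttpResponse res = new Http().send(req);
        System.debug('ERP response status: ' + res.getStatusCode());
    }
}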

Bulk API 

You should know when and how to use the Bulk API. How can you monitor a Bulk API job, both through the UI and through API calls? What is the difference between Serial and Parallel modes? Which APIs support Serial and Parallel?

Avoid Record Lock Scenarios and Performance Issues

The answer is not always to do the work on the SFDC tier. Many times you want to pre-process your data before you load it into SFDC. One example is grouping your data by a parent record, like Account, so that Bulk API jobs do not fail due to parent record locking. As another example, if you are integrating large volumes of data you do not want to use upsert; you want to do separate insert and update calls, which forces you to pre-process the IDs in your data set.

Want to learn even more? Join my team!

I am currently looking for the next set of Force.com engineers to mentor. If this type of technical content and challenge seems appealing to you, and you want to become an all-star Force.com engineer across the entire Force.com PaaS stack, reach out to me on Twitter or LinkedIn. I am looking to fill several USA-based engineering roles on my team at Fusion Risk Management (sorry, I cannot provide visa sponsorship).