A company is planning on sending orders from Salesforce to a fulfillment system. The integration architect has been asked to plan for the integration. Which two questions should the integration architect consider?
Choose 2 answers
A.
Can the fulfillment system create new addresses within the Order Create service?
B.
Can the fulfillment system make a callback into Salesforce?
C.
Can the fulfillment system implement a contract-first Outbound Messaging interface?
D.
Is the product catalog data identical at all times in both systems?
Can the fulfillment system make a callback into Salesforce?
Is the product catalog data identical at all times in both systems?
Explanation:
Callbacks into Salesforce are critical if you plan to use asynchronous communication patterns like Outbound Messaging or Platform Events that require acknowledgment or updates. Meanwhile, ensuring product catalog consistency is essential for order accuracy. If catalogs are misaligned, users might create orders with invalid or outdated items. Questions about address creation and contract-first design are more implementation details and less about initial architectural feasibility.
A developer has been tasked by the integration architect to build a solution based on the Streaming API. The developer has done some research and has found there are different implementations of events in Salesforce (PushTopic Events, Change Data Capture, Generic Streaming, Platform Events), but is unsure how to proceed with the implementation. The developer asks the system architect for some guidance. What should the architect consider when making the recommendation?
A.
Push Topic Event can define a custom payload.
B.
Change Data Capture does not have record access support.
C.
Change Data Capture can be published from Apex.
D.
Apex triggers can subscribe to Generic Events.
Change Data Capture does not have record access support.
Explanation:
The architect should note that:
→ CDC Limitations: Change Data Capture (B) doesn't support filtering by record access permissions (unlike PushTopics).
→ Platform Events (Not Listed): Would be better for custom event publishing.
PushTopics (A) allow query-based payloads but are legacy. The correct guidance depends on whether row-level security is needed for the streaming data.
A customer imports data from an external system into Salesforce using Bulk API. These jobs have batch sizes of 2000 and are run in parallel mode. The batches fail frequently with the error "Max CPU time exceeded". A smaller batch size will fix this error. Which two options should be considered when using a smaller batch size? Choose 2 answers
A.
Smaller batch size may cause record-locking errors.
B.
Smaller batch size may increase time required to execute bulk jobs.
C.
Smaller batch size may exceed the concurrent API request limits.
D.
Smaller batch size can trigger "Too many concurrent batches" error.
Smaller batch size may increase time required to execute bulk jobs.
Smaller batch size can trigger "Too many concurrent batches" error.
Explanation:
When using Salesforce Bulk API, large batch sizes can exceed CPU limits during processing—especially if triggers, flows, or validation rules are intensive. Reducing the batch size is a logical mitigation step, as smaller chunks reduce CPU time per execution unit. However, this increases the number of batches needed to complete the job.
More batches mean longer total execution time (B), since each batch must be queued, processed, and possibly retried. Additionally, Salesforce imposes limits on concurrent batches—typically 5 for synchronous and up to 100 for asynchronous jobs depending on org limits. Exceeding this results in “Too many concurrent batches” errors (D), halting or delaying processing.
While it’s tempting to reduce batch sizes drastically, it’s important to balance performance and limit thresholds. Options A and C are incorrect: smaller batch sizes reduce locking issues, and they don’t inherently violate concurrent API request limits, which are separate from batch execution concurrency.
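As an illustration, here is a minimal sketch of how a client controls its own batch size with Bulk API 1.0 (the instance URL, session ID, API version, object, and CSV file are placeholders): halving BATCH_SIZE doubles the number of batches, which lengthens the overall job and pushes against the concurrent-batch ceiling described above.

```python
# Sketch only: split a large CSV into smaller Bulk API 1.0 batches.
# Assumes a valid session ID and instance URL obtained separately.
import requests

INSTANCE = "https://yourInstance.my.salesforce.com"   # placeholder
SESSION_ID = "<session id from login>"                # placeholder
API = "52.0"
BATCH_SIZE = 500  # reduced from 2000 to avoid "Max CPU time exceeded"

headers_xml = {"X-SFDC-Session": SESSION_ID, "Content-Type": "application/xml"}
job_xml = """<?xml version="1.0" encoding="UTF-8"?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
  <operation>insert</operation>
  <object>Account</object>
  <contentType>CSV</contentType>
</jobInfo>"""

job = requests.post(f"{INSTANCE}/services/async/{API}/job",
                    headers=headers_xml, data=job_xml)
job_id = job.text.split("<id>")[1].split("</id>")[0]  # quick-and-dirty parse

rows = open("accounts.csv").read().splitlines()
header, data = rows[0], rows[1:]
for i in range(0, len(data), BATCH_SIZE):
    # Each chunk becomes one batch; smaller chunks => more batches in the queue.
    chunk = "\n".join([header] + data[i:i + BATCH_SIZE])
    requests.post(f"{INSTANCE}/services/async/{API}/job/{job_id}/batch",
                  headers={"X-SFDC-Session": SESSION_ID,
                           "Content-Type": "text/csv"},
                  data=chunk)
```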
Northern Trail Outfitters (NTO) has recently changed their Corporate Security Guidelines. The guidelines require that all cloud applications pass through a secure firewall before accessing on-premise resources. NTO is evaluating middleware solutions to integrate cloud applications with on-premise resources and services. What are two considerations an Integration Architect should evaluate before choosing a middleware solution?
Choose 2 answers
A.
The middleware solution is capable of establishing a secure API gateway between cloud applications and on-premise resources.
B.
An API gateway component is deployable behind a Demilitarized Zone (DMZ) or perimeter network.
C.
The middleware solution enforces the OAuth security protocol.
D.
The middleware solution is able to interface directly with databases via an ODBC connection string.
The middleware solution is capable of establishing a secure API gateway between cloud applications and on-premise resources.
An API gateway component is deployable behind a Demilitarized Zone (DMZ) or perimeter network.
Explanation:
When integrating Salesforce (a cloud platform) with on-premise resources, the architect must overcome challenges like firewall restrictions, network security, and data governance. Middleware becomes a bridge, often deployed in a DMZ to allow limited, controlled access from external systems while maintaining a strong internal security posture.
A key requirement is that the middleware can act as a secure API gateway (A)—this enables controlled exposure of internal services to Salesforce or other cloud platforms. The ability to deploy components of the middleware inside the DMZ (B) is critical. It enables routing or proxying of requests while ensuring that no direct access is granted to internal systems.
Options C and D are less critical or incorrect: OAuth (C) is typically used for user authentication, not always for middleware; and direct ODBC connections (D) from Salesforce via middleware are rarely recommended due to security and scalability issues.
Which WSDL should an architect consider when creating an integration that might be used for more than one Salesforce organization and different metadata?
A.
Corporate WSDL
B.
Partner WSDL
C.
SOAP API WSDL
D.
Enterprise WSDL
Partner WSDL
Explanation:
Salesforce offers two main WSDLs for SOAP integrations: Enterprise and Partner. The Enterprise WSDL is strongly typed and tightly coupled with a specific org’s metadata (custom objects, fields, etc.). This means it must be regenerated if metadata changes, and is not portable across orgs.
In contrast, the Partner WSDL is loosely typed and uses a more flexible schema. It represents objects and fields as generic name-value pairs (like sObject and fieldsToNull), which makes it ideal for cross-org integrations where metadata varies or changes frequently.
For ISVs or scenarios where the integration must be reusable across different Salesforce environments (e.g., dev, staging, production, or multiple clients), the Partner WSDL is the better choice. It’s also better suited for dynamic scenarios like schema discovery or integration with systems that don't maintain tight data models.
Thus, Partner WSDL provides maximum flexibility, making it the preferred option when metadata cannot be guaranteed to be identical across orgs.
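For example, here is a minimal sketch using the Python zeep SOAP library against a downloaded partner.wsdl (file name and credentials are placeholders). Because the Partner WSDL is loosely typed, the same client code can log in to any org without regenerating stubs.

```python
# Sketch: org-agnostic login through the loosely typed Partner WSDL.
# Assumes the zeep library and a partner.wsdl file downloaded from Setup.
from zeep import Client

client = Client("partner.wsdl")                      # generic, not tied to one org's metadata
result = client.service.login("user@example.com", "password+securityToken")

print(result.serverUrl)   # org-specific SOAP endpoint, returned at runtime
print(result.sessionId)   # session ID to pass in the SessionHeader on later calls
# Subsequent calls work with generic sObject name/value pairs rather than
# strongly typed, org-specific classes.
```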
A company's security assessment noted vulnerabilities in the unmanaged packages in their Salesforce orgs, notably secrets that are easily accessible in plain text, such as usernames, passwords, and OAuth tokens used in callouts from Salesforce. Which two persistence mechanisms should an integration architect require to be used to ensure that secrets are protected from deliberate or inadvertent exposure?
Choose 2 answers
A.
Encrypted Custom Fields
B.
Named Credentials
C.
Protected Custom Metadata Types
D.
Protected Custom Settings
Named Credentials
Protected Custom Metadata Types
Explanation:
Salesforce provides multiple mechanisms to store secrets securely and avoid hardcoding sensitive data like OAuth tokens, API keys, and credentials. The best practice is to use Named Credentials (B), which securely store authentication settings (e.g., username/password, OAuth tokens) and abstract them from Apex code. This ensures secrets aren't exposed in code or config and simplifies endpoint management.
Protected Custom Metadata Types (C) allow you to store config data like endpoints, keys, or feature flags. Marking metadata records as "protected" ensures they are not visible outside managed packages, shielding secrets from org admins and preventing accidental exposure.
Encrypted Custom Fields (A) are better for storing secure business data (e.g., SSNs), not integration secrets. Protected Custom Settings (D) are legacy tools and lack the security enforcement of Protected Custom Metadata—making them less safe for storing secrets.
Using Named Credentials and Protected Metadata Types ensures compliance with secure coding practices and reduces risk from accidental disclosure or misuse of sensitive integration data.
An organization needs to integrate Salesforce with an external system and is considering authentication options. The organization has already implemented SAML, using a third-party Identity Provider for integrations between other systems. Which use case can leverage the existing SAML integration to connect Salesforce with other internal systems?
A.
Make formula fields with HYPERLINK() to external web servers more secure.
B.
Make Apex SOAP outbound integrations to external web services more secure.
C.
Make Apex REST outbound integrations to external web services more secure.
D.
Make an API inbound integration from an external Java client more secure.
Make an API inbound integration from an external Java client more secure.
Explanation:
Salesforce supports SAML-based Single Sign-On (SSO) for both UI and API authentication. If a company already uses a SAML-compliant Identity Provider (IdP), this can be extended to secure inbound API connections. For example, a Java client integrating with Salesforce can authenticate users using SAML Bearer Assertion Flow. This allows the external system to obtain an access token for Salesforce using a previously authenticated SAML session, eliminating the need to store or transmit passwords.
Options B and C (Apex outbound calls) are not secured via SAML, as SAML is primarily for inbound user authentication—not for securing outbound REST/SOAP integrations from Salesforce.
Option A (HYPERLINK fields) isn't relevant to authentication at all.
So, option D is correct because it properly applies SAML for authenticating API requests coming into Salesforce, leveraging existing identity infrastructure and enhancing security by removing reliance on stored credentials.
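As a rough sketch of the mechanism (placeholder values throughout, shown in Python although the scenario names a Java client): the external client exchanges a signed SAML assertion from the existing IdP for a Salesforce access token using the OAuth 2.0 SAML Bearer Assertion flow, so no Salesforce password is ever stored.

```python
# Hedged sketch of the OAuth 2.0 SAML Bearer Assertion flow: trade a signed SAML
# assertion issued by the existing third-party IdP for a Salesforce access token.
import base64
import requests

signed_assertion = b"<saml:Assertion ...>"   # placeholder: the real assertion comes from the IdP
assertion_b64 = base64.urlsafe_b64encode(signed_assertion).decode()

resp = requests.post(
    "https://login.salesforce.com/services/oauth2/token",
    data={
        "grant_type": "urn:ietf:params:oauth:grant-type:saml2-bearer",
        "assertion": assertion_b64,
    },
)
access_token = resp.json().get("access_token")   # Bearer token for subsequent API calls
```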
Universal Containers (UC) owns a variety of cloud-based applications, including Salesforce, alongside several on-premise applications. The on-premise applications are protected behind a corporate network with limited outside access to external systems. UC would like to expose data from the on-premise applications to Salesforce for a more unified user experience. The data should be accessible from Salesforce in real time. Which two actions should be recommended to fulfill this system requirement?
Choose 2 answers
A.
Develop an application in Heroku that connects to the on-premise database via an ODBC string and VPC connection.
B.
Develop custom APIs on the company's network that are invokable by Salesforce.
C.
Deploy MuleSoft to the on-premise network and design externally facing APIs to expose the data.
D.
Run a batch job with an ETL tool from an on-premise server to move data to Salesforce.
Develop custom APIs on the company's network that are invokable by Salesforce.
Deploy MuleSoft to the on-premise network and design externally facing APIs to expose the data.
Explanation:
To achieve real-time data access from Salesforce to on-premise systems, the integration must overcome network barriers and ensure secure, low-latency access. Developing custom REST or SOAP APIs (B) on the internal network that Salesforce can invoke (via callouts) is a direct and flexible approach. These APIs should be exposed securely through a DMZ or API gateway.
Alternatively, MuleSoft (C) is an ideal middleware solution for hybrid integrations. When deployed on-premise, MuleSoft can bridge cloud-to-ground communication, managing authentication, transformation, and secure API exposure. It simplifies complex integration flows and allows centralized governance and error handling.
Option A (Heroku + ODBC) is overly complex and introduces unnecessary hops. Option D (batch ETL) does not support real-time use cases and contradicts the requirement for live access.
Therefore, the best strategy includes secure, API-based access with minimal latency, enabled either directly or through a robust integration platform like MuleSoft.
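To make the idea behind option B concrete, here is a hedged sketch (using Flask purely for illustration; the route, token check, and data source are invented) of a small internal API that could sit behind the DMZ/API gateway and be invoked from Salesforce via an HTTPS callout.

```python
# Illustrative only: a tiny on-premise REST endpoint Salesforce could call in real time.
# In practice this would sit behind an API gateway in the DMZ with TLS and proper auth.
from flask import Flask, jsonify, request, abort

app = Flask(__name__)

@app.route("/api/v1/orders/<order_id>")
def get_order(order_id):
    # Placeholder auth check: a real deployment would validate an OAuth token or mTLS cert.
    if request.headers.get("Authorization") != "Bearer <expected-token>":
        abort(401)
    # Placeholder lookup: a real implementation would query the internal system of record.
    return jsonify({"orderId": order_id, "status": "SHIPPED"})

if __name__ == "__main__":
    app.run(port=8443)  # fronted by the gateway; TLS terminated at the perimeter
```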
Only authorized users are allowed access to the EBS and the Enterprise DMS. Customers call Customer Support when they need clarification on their bills. Customer Support needs seamless access to customer billing information from the EBS and needs to view generated bills from the DMS. Which three authorization and authentication needs should an integration consultant consider while integrating the DMS and EBS with Salesforce?
Choose 3 answers
A.
Users should be authorized to view information specific to the customer they are servicing without a need to search for customer.
B.
Identify options to maintain DMS and EBS authentication and authorization details in Salesforce.
C.
Consider Enterprise security needs for access to DMS and EBS.
D.
Consider options to migrate DMS and EBS into Salesforce.
E.
Users should be authenticated into DMS and EBS without having to enter username and password.
Users should be authorized to view information specific to the customer they are servicing without a need to search for customer.
Consider Enterprise security needs for access to DMS and EBS.
Users should be authenticated into DMS and EBS without having to enter username and password.
Explanation:
Integrating Salesforce with systems like Document Management Systems (DMS) and Enterprise Billing Systems (EBS) requires strong identity and access control. First, users should be able to view only data for the customer they are supporting (A)—this is critical for both data privacy and security compliance.
Enterprise-level security (C) must be accounted for, including firewall restrictions, audit logging, and encryption. The architecture should align with internal IT and security policies, especially when exposing sensitive customer billing and document data.
For seamless user experience, users should be authenticated into DMS and EBS without re-entering credentials (E). This is best achieved with SSO or token-based authentication (like SAML or JWT), which improves usability and security by avoiding password proliferation.
Option B (storing credentials in Salesforce) is discouraged due to security risks. Option D (migrating DMS/EBS into Salesforce) is impractical for most enterprises due to cost, complexity, and compliance limitations.
A healthcare services company maintains a Patient Prescriptions System that has 50+ million records in a secure database. Their customer base and data set are growing rapidly.
They want to make sure that the following policies are enforced:
1. Identifiable patient prescriptions must exist only in their secure system's database and be encrypted at rest.
2. Identifiable patient prescriptions must be made available only to people explicitly authorized in the Patient Prescriptions System: assigned nurses and doctors, the patient, and people the patient explicitly authorizes.
3. Must be available only to verified and pre-approved people or legal entities.
To enable this, the company provides the following capabilities:
1. One-time use identity tokens for patients, nurses, doctors, and other people that expire within a few minutes.
2. Certificates for legal entities.
3. RESTful services.
The company has a Salesforce Community Cloud portal for patients, nurses, doctors, and other authorized people. A limited number of employees analyze de-identified data in Einstein Analytics.
Which two capabilities should the integration architect require for the Community Cloud portal and Einstein Analytics?
Choose 2 answers
A.
Identity token data storage
B.
Bulk load for Einstein Analytics
C.
Callouts to RESTful services
D.
Encryption in transit and at rest
Callouts to RESTful services
Encryption in transit and at rest
Explanation:
To meet strict data privacy requirements for medical data, integrations must ensure security and compliance. The Community Cloud portal (now called Experience Cloud) must retrieve sensitive prescription data via secure RESTful APIs (C). These APIs are built to support token-based access control (e.g., JWT, one-time-use tokens) and ensure that only authorized users can access specific data.
In addition, encryption in transit and at rest (D) is a regulatory and ethical requirement. TLS/SSL ensures data is secure during transmission, while database-level or platform encryption protects stored data from unauthorized access. Einstein Analytics (now Tableau CRM) must also adhere to these policies if it handles even de-identified data.
Option A (storing identity tokens) is incorrect, as tokens are transient and should not be persisted. Option B (bulk loading into analytics) violates the real-time, secure, and minimal retention principles defined in the scenario.
Northern Trail Outfitters' (NTO) Salesforce org usually goes through 8k-10k batches a day to sync data from external sources. NTO's Integration Architect has received requirements for a new custom object, FooBar__c, for which 90M records will need to be loaded into the org. Once complete, 20GB (about 30M records) needs to be extracted to an external auditing system. What should the architect recommend using to meet these requirements in a day?
A.
Insert using Bulk API 2.0 and query using REST API.
B.
Insert and query using Bulk API 1.0.
C.
Insert using Bulk API 1.0 and query using REST API.
D.
Insert and query using Bulk API 2.0.
Insert and query using Bulk API 2.0.
Explanation:
Salesforce Bulk API 2.0 is optimized for large-scale data operations with simplified job management and better performance over Bulk API 1.0. It automatically handles batching and concurrency behind the scenes. For inserting 90 million records in a day, Bulk API 2.0 is the best-suited tool due to its ability to parallelize jobs and manage throughput automatically.
Moreover, querying 30 million records for extraction (e.g., to the audit system) is also more efficient with Bulk API 2.0, which chunks large result sets automatically and handles them far better than the REST API. The REST API isn't built for high-volume querying; it's better suited to transactional data.
Bulk API 1.0, while still valid, requires manual batch splitting and doesn't scale as effectively for a job this size. The REST API (options A and C) is designed for transactional access, not for extracting roughly 30 million records in a single day.
Thus, Bulk API 2.0 is the correct tool for both insertion and extraction at this scale, while simplifying operations and improving reliability.
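A minimal sketch of both halves with Bulk API 2.0 (Python requests; the instance URL, access token, API version, and the FooBar__c CSV file are placeholders), showing that the API manages batching and result chunking itself:

```python
# Sketch: insert FooBar__c rows and extract audit data with Bulk API 2.0.
# Assumes an OAuth access token and instance URL obtained separately.
import requests

BASE = "https://yourInstance.my.salesforce.com/services/data/v52.0"
HEAD = {"Authorization": "Bearer <access token>", "Content-Type": "application/json"}

# 1. Ingest job: Salesforce splits the uploaded CSV into batches automatically.
job = requests.post(f"{BASE}/jobs/ingest", headers=HEAD,
                    json={"object": "FooBar__c", "operation": "insert"}).json()
requests.put(f"{BASE}/jobs/ingest/{job['id']}/batches",
             headers={"Authorization": HEAD["Authorization"],
                      "Content-Type": "text/csv"},
             data=open("foobar_records.csv", "rb"))
requests.patch(f"{BASE}/jobs/ingest/{job['id']}",
               headers=HEAD, json={"state": "UploadComplete"})

# 2. Query job for the extract: large result sets are returned in pages.
qjob = requests.post(f"{BASE}/jobs/query", headers=HEAD,
                     json={"operation": "query",
                           "query": "SELECT Id, Name FROM FooBar__c"}).json()
# Poll the job state, then page through f"{BASE}/jobs/query/{qjob['id']}/results".
```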
A customer is evaluating a Platform Events solution and would like help comparing and contrasting it with Outbound Messaging for real-time / near-real-time needs. They expect 3,000 consumers of messages from Salesforce. Which three considerations should be evaluated and highlighted when deciding between the solutions?
Choose 3 answers
A.
Both Platform Events and Outbound Message offer declarative means for asynchronous near-real time needs. They aren't best suited for realtime integrations.
B.
In both Platform Events and Outbound Messages, the event messages are retried by and delivered in sequence, and only once. Salesforce ensures there is no duplicate message delivery.
C.
Message sequence is possible in Outbound Message but not guaranteed with Platform Events. Both offer very high reliability. Fault handling and recovery are fully handled by Salesforce.
D.
Number of concurrent subscribers to Platform Events is capped at 2,000. An Outbound Message configuration can pass only 100 notifications in a single message to a SOAP endpoint.
E.
Both Platform Events and Outbound Message are highly scalable. However, unlike Outbound Message, only Platform Events have Event Delivery and Event Publishing limits to be considered.
Both Platform Events and Outbound Message offer declarative means for asynchronous near-real time needs. They aren't best suited for realtime integrations.
Number of concurrent subscribers to Platform Events is capped at 2,000. An Outbound Message configuration can pass only 100 notifications in a single message to a SOAP endpoint.
Both Platform Events and Outbound Message are highly scalable. However, unlike Outbound Message, only Platform Events have Event Delivery and Event Publishing limits to be considered.
Explanation:
When comparing Platform Events and Outbound Messaging, it’s important to evaluate scale, reliability, and delivery characteristics. Both are asynchronous, near-real-time mechanisms that can be raised declaratively (Outbound Messages via Workflow rules; Platform Events via Process Builder or Flow), and Platform Events can also be published from Apex or the API.
Platform Events (PE) support multiple subscribers (up to 2,000), while Outbound Messages (OM) only target a single endpoint per configuration and support up to 100 messages per call (D). PE is more flexible for event-driven, publish-subscribe patterns across many consumers.
However, PE has strict event publishing and delivery limits (E), which can throttle high-throughput use cases. Outbound Messaging doesn't have such publish limits but lacks the scalability and retry customization offered by PE.
Option B is false — PE doesn’t guarantee message order or uniqueness, and messages may be delivered more than once. Option C is also incorrect — message sequencing is not guaranteed in OM and reliability varies.
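For example, publishing a Platform Event to all of its subscribers takes a single REST call; a minimal sketch assuming a hypothetical Order_Event__e event with an Order_Number__c field and a pre-obtained access token:

```python
# Sketch: publish one Order_Event__e platform event via the standard sObject REST endpoint.
# Every active subscriber (CometD clients, Apex triggers, flows) receives the same event.
import requests

BASE = "https://yourInstance.my.salesforce.com/services/data/v52.0"
resp = requests.post(
    f"{BASE}/sobjects/Order_Event__e",
    headers={"Authorization": "Bearer <access token>",
             "Content-Type": "application/json"},
    json={"Order_Number__c": "ORD-1001"},   # hypothetical custom field on the event
)
print(resp.json())   # publish result; delivery to subscribers is asynchronous
```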