Universal Containers (UC) has built a custom application on Salesforce to help track shipments around the world. A majority of the shipping records are stored on-premises in an external data source. UC needs shipment details to be exposed to the custom application, and the data needs to be accessible in real time. The external data source is not OData enabled, and UC does not own a middleware tool. Which Salesforce Connect procedure should a data architect use to ensure UC's requirements are met?
A. Write an Apex class that makes a REST callout to the external API.
B. Develop a process that calls an invocable web service method.
C. Migrate the data to Heroku and register Postgres as a data source.
D. Write a custom adapter with the Apex Connector Framework.

Answer: D. Write a custom adapter with the Apex Connector Framework.
Explanation:
The Apex Connector Framework enables building custom adapters for Salesforce Connect, creating virtual "external objects" that mirror external data in real time without replication. This meets UC's real-time requirement for non-OData systems. Option A (REST callout) requires custom Apex but doesn't surface the data natively as objects. Option B is vague and impractical. Option C (Heroku) violates the "external data source" mandate by requiring migration. A custom adapter (D) uses Apex to translate external API responses into external object rows, with no OData endpoint or middleware required. Shipment data appears as external objects in Salesforce, accessible via SOQL, reports, and page layouts, solving real-time visibility without middleware.
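A minimal sketch of such an adapter (hypothetical table, field, and endpoint names; a real adapter would map UC's shipment API). The provider class registers the connection class, whose sync() method describes the external object's schema and whose query() method calls the on-premises API whenever the external object is queried:

global class ShipmentProvider extends DataSource.Provider {
    override global List<DataSource.AuthenticationCapability> getAuthenticationCapabilities() {
        // Anonymous for brevity; a real adapter would typically use a named credential.
        return new List<DataSource.AuthenticationCapability>{
            DataSource.AuthenticationCapability.ANONYMOUS };
    }
    override global List<DataSource.Capability> getCapabilities() {
        return new List<DataSource.Capability>{ DataSource.Capability.ROW_QUERY };
    }
    override global DataSource.Connection getConnection(DataSource.ConnectionParams params) {
        return new ShipmentConnection();
    }
}

global class ShipmentConnection extends DataSource.Connection {
    // Declares one external table; ExternalId and DisplayUrl are required columns.
    override global List<DataSource.Table> sync() {
        List<DataSource.Column> cols = new List<DataSource.Column>{
            DataSource.Column.text('ExternalId', 255),
            DataSource.Column.url('DisplayUrl'),
            DataSource.Column.text('TrackingNumber', 64),
            DataSource.Column.text('Status', 32) };
        return new List<DataSource.Table>{
            DataSource.Table.get('Shipment', 'TrackingNumber', cols) };
    }

    // Runs on every SOQL query, report, or page view of the external object.
    override global DataSource.TableResult query(DataSource.QueryContext context) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint('callout:Shipment_API/shipments'); // assumed named credential
        req.setMethod('GET');
        HttpResponse res = new Http().send(req);
        List<Map<String, Object>> rows = new List<Map<String, Object>>();
        for (Object item : (List<Object>) JSON.deserializeUntyped(res.getBody())) {
            Map<String, Object> src = (Map<String, Object>) item;
            rows.add(new Map<String, Object>{
                'ExternalId' => src.get('id'),
                'DisplayUrl' => 'https://shipping.example.com/' + src.get('id'),
                'TrackingNumber' => src.get('trackingNumber'),
                'Status' => src.get('status') });
        }
        return DataSource.TableResult.get(context, rows);
    }
}

Once deployed, the provider appears as a custom type when defining the external data source, and validating and syncing it creates the Shipment external object.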
Universal Containers (UC) has accumulated data over the years and has never deleted data from its Salesforce org. UC is now exceeding the org's storage allocations and is looking for options to delete unused data from the org. Which three recommendations should a data architect make in order to reduce the number of records in the org? Choose 3 answers
A. Use hard delete in Bulk API to permanently delete records from Salesforce.
B. Use hard delete in batch Apex to permanently delete records from Salesforce.
C. Identify records in objects that have not been modified or used in the last 3 years.
D. Use REST API to permanently delete records from the Salesforce org.
E. Archive the records in an enterprise data warehouse (EDW) before deleting from Salesforce.

Answer: A, C, E.
Explanation:
A: Hard delete via the Bulk API bypasses the Recycle Bin, permanently removing records at scale (in batches of up to 10,000 records), so storage is reclaimed immediately.
C: Identifying unused records (e.g., via a SOQL filter on LastModifiedDate) targets deletion efforts efficiently. Objects like Tasks, old Cases, or logs are prime candidates.
E: Archiving to an EDW preserves the data for compliance and retention requirements before the irreversible deletion from Salesforce.
Excluded Options:
B: Batch Apex is subject to governor limits (e.g., 10,000 DML rows per transaction), is less efficient than the Bulk API, and its deletes go to the Recycle Bin unless Database.emptyRecycleBin() is also called.
D: REST API has lower throughput than Bulk API for mass deletion.
Archiving (E) is critical: Salesforce storage is expensive, while cloud EDWs (e.g., Snowflake) offer cheaper long-term retention.
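A minimal sketch of the identify-then-hard-delete flow (illustrative object and retention window; the running user needs the "Bulk API Hard Delete" permission):

// Step 1 (C): identify stale candidates with SOQL, e.g. exported via Data Loader
SELECT Id FROM Case WHERE IsClosed = true AND LastModifiedDate < LAST_N_YEARS:3

// Step 2 (A): submit a Bulk API 2.0 hard-delete job for the exported Ids
POST /services/data/v60.0/jobs/ingest
{ "object": "Case", "operation": "hardDelete", "contentType": "CSV" }
// Then upload the Id CSV to the job's batches endpoint and mark the job
// UploadComplete; hard-deleted rows bypass the Recycle Bin entirely.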
North Trail Outfitters (NTO) operates a majority of its business from a central Salesforce org. NTO also owns several secondary orgs that the service, finance, and marketing teams work out of. At the moment, there is no integration between the central and secondary orgs, leading to data-visibility issues. Moving forward, NTO has identified that a hub-and-spoke model is the proper architecture to manage its data, where the central org is the hub and the secondary orgs are the spokes. Which tool should a data architect use to orchestrate data between the hub org and spoke orgs?
A. A middleware solution that extracts and distributes data across both the hub and spokes.
B. Develop custom APIs to poll the hub org for change data and push it into the spoke orgs.
C. Develop custom APIs to poll the spoke orgs for change data and push it into the hub org.
D. A backup and archive solution that extracts and restores data across orgs.

Answer: A. A middleware solution that extracts and distributes data across both the hub and spokes.
Explanation:
Middleware (e.g., MuleSoft, Informatica) centralizes bidirectional data orchestration in hub-and-spoke architectures. It polls the hub for changes and pushes delta updates to the spokes (and vice versa) in near real time. Custom APIs (B/C) require building polling logic, error handling, and security, creating maintenance overhead. Option B's "poll hub, push to spokes" misses spoke-to-hub syncs, and option C covers only the opposite direction. Backup tools (D) copy data but don't synchronize live changes. Middleware handles conflict resolution, logging, and scalability across orgs, making it the only sustainable solution for ongoing data visibility.
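The delta-polling pattern such middleware implements can be sketched with the standard REST "Get Updated" resource (illustrative object and time window):

// Ask the hub which Opportunity records changed in the window
GET /services/data/v60.0/sobjects/Opportunity/updated/?start=2024-01-01T00:00:00Z&end=2024-01-02T00:00:00Z
// The response lists the changed record Ids; the middleware fetches those
// records from the hub and upserts them into each spoke, and runs the same
// cycle in the opposite direction for spoke-to-hub changes.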
Get Cloudy Consulting monitors 15,000 servers, and these servers automatically record their status every 10 minutes. Because of company policy, these status reports must be maintained for 5 years. Managers at Get Cloudy Consulting need access to up to one week's worth of these status reports with all of their details. An Architect is recommending what data should be integrated into Salesforce and for how long it should be stored in Salesforce. Which two limits should the Architect be aware of? (Choose two.)
A. Data storage limits
B. Workflow rule limits
C. API Request limits
D. Webservice callout limits

Answer: A, C.
Explanation:
A. Data storage limits: Storing 5 years' worth of status reports for 15,000 servers would far exceed Salesforce data storage limits (see the arithmetic below).
C. API request limits: Continuous and frequent updates (every 10 minutes) could breach daily API request limits.
Incorrect Options:
B. Workflow rule limits: Irrelevant unless using workflows, which is not stated.
D. Webservice callout limits: Applies when Salesforce initiates outbound calls, not the scenario here.
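Rough figures make both limits concrete (assuming one record per status report):

15,000 servers × 6 reports/hour × 24 hours ≈ 2.16 million records per day
One week (the managers' requirement): 2.16 million × 7 ≈ 15.1 million records kept in Salesforce, which at the nominal 2 KB per record is already about 30 GB
Five years (the retention policy): 2.16 million × 365 × 5 ≈ 3.9 billion records, which belong in external storage, not in the org

Loading roughly 2.16 million records per day also means the integration must be batched (e.g., via Bulk API) so it does not exhaust the org's daily API request allocation.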
UC has one Salesforce org (Org A) and recently acquired a secondary company with its own Salesforce org (Org B). UC has decided to keep the orgs running separately but would like to bidirectionally share Opportunities between the orgs in near-real time. Which 3 options should a data architect recommend to share data between Org A and Org B? Choose 3 answers.
A. Leverage Heroku Connect and Heroku Postgres to bidirectionally sync Opportunities.
B. Install a 3rd-party AppExchange tool to handle the data sharing.
C. Develop an Apex class that pushes opportunity data between orgs daily via the Apex scheduler.
D. Leverage middleware tools to bidirectionally send Opportunity data across orgs.
E. Use Salesforce Connect and the cross-org adapter to visualize Opportunities as external objects.

Answer: A, B, D.
Explanation:
A. Heroku Connect: Syncs data in near real-time using Heroku Postgres and Connect.
B. AppExchange tool: Pre-built solutions can offer low-code integration capabilities.
D. Middleware tools: Robust and scalable for real-time and bidirectional syncing.
Incorrect Options:
C. Scheduled Apex: Not near real-time and lacks error handling and flexibility.
E. Salesforce Connect: The cross-org adapter is best for viewing data across orgs in real time as external objects; it does not replicate Opportunities into each org.
Universal Containers (UC) has an open sharing model for its Salesforce users to allow all its Salesforce internal users to edit all contacts, regardless of who owns the contact. However, UC management wants to allow only the owner of a contact record to delete that contact. If a user does not own the contact, then the user should not be allowed to delete the record. How should the architect approach the project so that the requirements are met?
A. Create a "before delete" trigger to check if the current user is not the owner.
B. Set the Sharing settings as Public Read Only for the Contact object.
C. Set the profile of the users to remove delete permission from the Contact object.
D. Create a validation rule on the Contact object to check if the current user is not the owner.

Answer: A. Create a "before delete" trigger to check if the current user is not the owner.
Explanation:
Triggers can enforce complex logic such as permission overrides based on ownership. A "before delete" trigger can check the current user's ID against the record owner and prevent deletion accordingly, satisfying the requirement precisely.
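A minimal sketch of such a trigger (a real implementation might also exempt administrators, e.g. via a custom permission check):

trigger ContactDeleteGuard on Contact (before delete) {
    for (Contact c : Trigger.old) {
        // Block the delete unless the deleting user owns the record.
        if (c.OwnerId != UserInfo.getUserId()) {
            c.addError('Only the contact owner can delete this record.');
        }
    }
}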
Incorrect Options:
B. Public Read Only: Prevents editing, not deletion. Also contradicts the open sharing model.
C. Profile-based delete restriction: Applies universally, not conditionally by owner.
D. Validation rule: Validation rules do not fire on delete events.
Universal Containers wishes to maintain Lead data even after Leads are deleted and cleared from the Recycle Bin. What approach should be implemented to achieve this solution?
A. Use a Lead standard report and filter on the IsDeleted standard field.
B. Use a Converted Lead report to display data on Leads that have been deleted.
C. Query Salesforce with the queryAll API method or using the ALL ROWS SOQL keywords.
D. Send data to a Data Warehouse and mark Leads as deleted in that system.

Answer: D. Send data to a Data Warehouse and mark Leads as deleted in that system.
Explanation:
Leads deleted and purged from the Recycle Bin are permanently unrecoverable in Salesforce.
Options A, B, and C fail: reports (A/B) can't access purged records, and queryAll/ALL ROWS (C) retrieves soft-deleted records only while they remain in the Recycle Bin, not after the purge.
Archiving Leads to a data warehouse (D) preserves the historical data. Automation (e.g., a trigger, sketched below) can flag Leads as "deleted" in the EDW when they are deleted in Salesforce. This complies with data retention policies without consuming Salesforce storage.
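A minimal sketch of that flag-on-delete automation (hypothetical warehouse endpoint; assumes a named credential and that Lead data was previously replicated to the EDW). Callouts aren't allowed directly from triggers, so the trigger hands the Ids to a queueable job (trigger and class are separate components in a real org):

trigger LeadArchiveFlag on Lead (after delete) {
    System.enqueueJob(new LeadArchiveFlagJob(Trigger.oldMap.keySet()));
}

public class LeadArchiveFlagJob implements Queueable, Database.AllowsCallouts {
    private Set<Id> leadIds;
    public LeadArchiveFlagJob(Set<Id> leadIds) { this.leadIds = leadIds; }

    public void execute(QueueableContext ctx) {
        // Tell the warehouse which Leads were just deleted in Salesforce.
        HttpRequest req = new HttpRequest();
        req.setEndpoint('callout:EDW_API/leads/mark-deleted'); // assumed named credential
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/json');
        req.setBody(JSON.serialize(new List<Id>(leadIds)));
        new Http().send(req);
    }
}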
Universal Containers (UC) is implementing Salesforce and will be using Salesforce to track customer complaints, provide white papers on products, and provide subscription-based support. Which license type will UC users need to fulfill UC's requirements?
A. Sales Cloud License
B. Lightning Platform Starter License
C. Service Cloud License
D. Salesforce License

Answer: C. Service Cloud License.
Explanation:
UC’s needs include tracking complaints (cases), offering white papers (knowledge base), and managing subscription-based support, all of which are covered under the Service Cloud license. It includes features such as Cases, Knowledge, and Entitlements.
Incorrect Options:
A. Sales Cloud: Focused on Leads, Opportunities—not sufficient for service use cases.
B. Lightning Platform Starter: Very limited functionality, not intended for service support.
D. Salesforce License: Generic term, not a specific license type.
Universal Containers (UC) requires 2 years of customer-related cases to be available in Salesforce for operational reporting. Any cases older than 2 years and up to 7 years need to be available on demand to the Service agents. UC creates 5 million cases per year. Which 2 data archiving strategies should a data architect recommend? Choose 2 options:
A. Use custom objects for cases older than 2 years and use nightly batch to move them.
B. Sync cases older than 2 years to an external database, and provide Service agents access to the database.
C. Use Big objects for cases older than 2 years, and use nightly batch to move them.
D. Use Heroku and external objects to display cases older than 2 years and bulk API to hard delete from Salesforce.

Answer: C, D.
Explanation:
✅ C. Use Big objects for cases older than 2 years, and use nightly batch to move them.
Big Objects are designed to handle massive data volumes that don't need frequent access, which makes them ideal for storing historical data like cases older than 2 years. They support SOQL with some limitations (queries must filter on the big object's index fields) and are cost-effective for long-term storage. A nightly batch job ensures that eligible data is moved regularly (see the batch sketch below).
✅ D. Use Heroku and external objects to display cases older than 2 years and bulk API to hard delete from Salesforce.
Heroku with external objects (via Salesforce Connect) is a good strategy for providing on-demand access to historical data stored outside Salesforce. This approach keeps Salesforce data volumes and performance in check, and the Bulk API can hard delete old records once they've been archived externally.
❌ A. Use custom objects for cases older than 2 years and use nightly batch to move them.
This increases storage usage in Salesforce and does not significantly reduce org size. It also lacks the querying performance benefits of Big Objects or external systems.
❌ B. Sync cases older than 2 years to an external database, and provide Service agents access to the database.
While viable in concept, this lacks seamless integration within the Salesforce UI. Service agents would need to leave Salesforce to access case data, which hurts productivity.
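A minimal sketch of the nightly batch in option C (assumes a hypothetical custom big object Case_Archive__b with matching fields; big objects are written with Database.insertImmediate rather than normal DML):

public class CaseArchiveBatch implements Database.Batchable<SObject> {
    public Database.QueryLocator start(Database.BatchableContext ctx) {
        return Database.getQueryLocator(
            'SELECT Id, CaseNumber, Subject, Status, ClosedDate ' +
            'FROM Case WHERE IsClosed = true AND ClosedDate < LAST_N_YEARS:2');
    }

    public void execute(Database.BatchableContext ctx, List<Case> scope) {
        List<Case_Archive__b> archive = new List<Case_Archive__b>();
        for (Case c : scope) {
            archive.add(new Case_Archive__b(
                Case_Number__c = c.CaseNumber,
                Subject__c     = c.Subject,
                Status__c      = c.Status,
                Closed_Date__c = c.ClosedDate));
        }
        Database.insertImmediate(archive); // copy into the big object
        delete scope;                      // then reclaim Case storage
    }

    public void finish(Database.BatchableContext ctx) {}
}

The job would be scheduled nightly via a Schedulable that calls Database.executeBatch(new CaseArchiveBatch());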
Universal Containers has two systems: Salesforce and an on-premises ERP system. An architect has been tasked with copying Opportunity records to the ERP once they reach a Closed/Won stage. The Opportunity record in the ERP system will be read-only for all fields copied in from Salesforce. What is the optimal real-time approach that achieves this solution?
A. Implement a Master Data Management system to determine system of record.
B. Implement a workflow rule that sends Opportunity data through Outbound Messaging.
C. Have the ERP poll Salesforce nightly and bring in the desired Opportunities.
D. Implement an hourly integration to send Salesforce Opportunities to the ERP system.

Answer: B. Implement a workflow rule that sends Opportunity data through Outbound Messaging.
Explanation:
✅ B. Implement a workflow rule that sends Opportunity data through Outbound Messaging.
Outbound Messaging is a native point-and-click feature that supports real-time integration (or near real-time) without requiring Apex code. It’s ideal for one-way data transfers like copying Closed/Won Opportunities to a read-only ERP system.
❌ A. Implement a Master Data Management system
MDM is overkill for this use case. It adds unnecessary complexity when Salesforce is clearly the system of record for Opportunities.
❌ C. Have the ERP poll Salesforce nightly
Polling is not real-time and is resource inefficient. It can also miss near-term updates or cause synchronization delays.
❌ D. Implement an hourly integration
An hourly schedule is not considered "real-time". Outbound Messaging provides immediate updates, which is the core requirement here.
Universal Containers (UC) has a custom discount request object set as a detail object with a custom product object as the master. There is a requirement to allow the creation of generic discount requests without the custom product object as its master record. What solution should an Architect recommend to UC?
A. Mandate the selection of a custom product for each discount request.
B. Create a placeholder product record for the generic discount request.
C. Remove the master-detail relationship and keep the objects separate.
D. Change the master-detail relationship to a lookup relationship.

Answer: D. Change the master-detail relationship to a lookup relationship.
Explanation:
✅ D. Change the master-detail relationship to a lookup relationship.
Master-detail relationships require the detail record to have a parent. To allow creation of standalone discount requests, a lookup relationship is appropriate. It allows flexibility: the request can link to a custom product when applicable and remain unlinked otherwise.
❌ A. Mandate the selection of a custom product
This violates the requirement for "generic" discount requests which must exist without a product.
❌ B. Create a placeholder product record
This is a workaround and introduces unnecessary data just to satisfy a structural constraint.
❌ C. Remove the master-detail relationship and keep the objects separate
This breaks the existing data model. A lookup allows partial detachment without redesigning object relationships completely.
Universal Containers wants to develop a dashboard in Salesforce that will allow Sales Managers to do data exploration using their mobile device (i.e., drill down into sales-related data) and have the possibility of adding ad-hoc filters while on the move. What is a recommended solution for building data exploration dashboards in Salesforce?
A. Create a Dashboard in an external reporting tool, export data to the tool, and add a link to the dashboard in Salesforce.
B. Create a Dashboard in an external reporting tool, export data to the tool, and embed the dashboard in Salesforce using the Canvas toolkit.
C. Create a standard Salesforce Dashboard and connect it to reports with the appropriate filters.
D. Create a Dashboard using Analytics Cloud that will allow the user to create ad-hoc lenses and drill down.

Answer: D. Create a Dashboard using Analytics Cloud that will allow the user to create ad-hoc lenses and drill down.
Explanation:
✅ D. Create a Dashboard using Analytics Cloud
Analytics Cloud (Tableau CRM) offers advanced features like mobile-optimized dashboards, drill-downs, ad-hoc filters, and interactive lenses. It is purpose-built for data exploration and supports offline capabilities.
❌ A & B. Use external reporting tools
These approaches involve data duplication, security management, and external logins. Not ideal for ad-hoc, mobile-first interaction.
❌ C. Standard Salesforce Dashboard
Standard dashboards are limited in interactivity and filtering on mobile. They are better for static reporting than exploration.