An administrator needs to create a junction object called Account Region to link the
standard Account object with a custom object called Region.
Once the junction object is created, what are the next two steps the administrator should
take?
Choose 2 answers
A. Make a master-detail relationship field on the junction object to the Region object.
B. Build a master-detail relationship field on the Region object to the junction object.
C. Create a master-detail relationship field on the Account object to the junction object.
D. Configure a master-detail relationship field on the junction object to the Account object.
Explanation:
To create a many-to-many relationship using a junction object, you must establish two master-detail relationships from the junction object to the two "master" objects you want to link. In this case, the junction object is Account Region, and the master objects are the standard Account object and the custom Region object.
A. Make a master-detail relationship field on the junction object to the Region object: This is one of the two necessary master-detail relationships. It establishes the connection from the Account Region junction object to the Region object.
D. Configure a master-detail relationship field on the junction object to the Account object: This is the second and equally necessary master-detail relationship. It establishes the connection from the Account Region junction object to the Account object.
Why other options are incorrect
B. Build a master-detail relationship field on the Region object to the junction object: Master-detail relationships are always created from the "detail" (or child) object to the "master" (or parent) object. The junction object is the child in this scenario, so the relationship must be defined on the junction object, not the Region object.
C. Create a master-detail relationship field on the Account object to the junction object: Similar to option B, this is backward. The Account object is a master, not the child, so the relationship must originate from the junction object.
Reference
Salesforce Help: Create a Many-to-Many Object Relationship: This article describes the process, stating, "Create a custom object to serve as the junction object. This object needs to have two master-detail relationships...".
Salesforce Ben: What is a Junction Object in Salesforce?: This resource explicitly states, "We now need to create our two master-detail relationships on the Job Application object... one will need to connect to our Candidate object, the other to Job Position." This confirms that the relationships are defined on the junction object.
Bottom Line
To form a many-to-many relationship, the junction object must have two master-detail fields, each pointing to one of the two objects being linked. The relationship is always defined on the child (detail) object, which in this case is the junction object, Account Region.
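To make the result concrete, here is a minimal Apex sketch of creating a junction record once both master-detail fields exist. The API names (Account_Region__c, Region__c, and the two relationship fields) are assumptions for illustration:

// Assumed schema: Account_Region__c junction object with two
// master-detail fields, Account__c (to Account) and Region__c
// (to the custom Region object).
Account acc = [SELECT Id FROM Account LIMIT 1];
Region__c reg = [SELECT Id FROM Region__c LIMIT 1];

// One junction row links one Account to one Region; many such rows
// together form the many-to-many relationship.
Account_Region__c link = new Account_Region__c(
    Account__c = acc.Id,
    Region__c  = reg.Id
);
insert link;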
Sales reps at AW Computing have asked the administrator to help them close deals faster
on the Salesforce mobile app when they're in the field. They want to be able to quickly close an opportunity and have key fields, like Stage, prepopulated to Closed Won.
What should an administrator create to achieve this?
A. Object-specific Quick Action
B. Global Quick Action
C. Lightning Component
D. Enhanced Related Lists
Explanation:
The requirement is to quickly close an Opportunity and have key fields pre-populated to "Closed Won."
An Object-Specific Quick Action (created on the Opportunity object) is the perfect solution for this use case.
When defining an Update Record action, the administrator can specify predefined field values.
In this case, the admin would pre-set the Stage field to "Closed Won" and the Probability field to "100%".
When the sales rep clicks this action, a simplified layout immediately appears with the pre-populated values, allowing them to quickly confirm the close and save, drastically speeding up the process, especially on the Salesforce mobile app.
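For illustration only, the net effect of the action's predefined values is equivalent to this Apex sketch (the actual solution is declarative and needs no code; the query is a placeholder for the record the rep is viewing):

// Effect of clicking the action and saving: the predefined Stage
// value is applied to the Opportunity being viewed.
Id oppId = [SELECT Id FROM Opportunity LIMIT 1].Id; // placeholder
Opportunity opp = new Opportunity(
    Id = oppId,
    StageName = 'Closed Won' // predefined field value on the action
);
update opp;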
Why the Other Options are Incorrect
B. Global Quick Action:
Global actions are not tied to a specific object, so they can't pre-populate fields on an Opportunity based on the context of viewing that specific Opportunity record. Global actions are better for creating records (like a new Lead) from anywhere.
C. Lightning Component:
A custom Lightning Component could certainly be built to perform this action, but the requirement can be met entirely with a declarative tool (Quick Action). In the Advanced Administrator exam, the best answer is almost always the most efficient, declarative solution.
D. Enhanced Related Lists:
This feature only provides more options for viewing and interacting with child records (like Contacts or Activities) on the record detail page. It has nothing to do with creating or updating the fields of the parent Opportunity record itself.
Northern Trail Outfitters has many users set up as system administrators to perform
Salesforce Administration.
Which two functions would a delegated administrator be able to perform in order to help the
existing Salesforce Administrator?
Choose 2 answers
A. Set up users and password management.
B. Configure updates to sharing rules.
C. Manage custom objects and customize nearly every aspect.
D. Make updates to permission set configurations.
Explanation:
Delegated Administration in Salesforce allows you to assign limited administrative privileges to trusted users without giving them full system administrator access. This helps distribute admin responsibilities while maintaining control over sensitive configurations.
Here’s what delegated administrators can do:
A. Set up users and password management ✅
They can create and manage users within specific roles or profiles, reset passwords, and unlock users.
D. Make updates to permission set configurations ✅
Delegated admins can assign and remove permission sets for users, helping manage access without modifying profiles directly.
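For context, a permission set assignment is simply a record on the standard PermissionSetAssignment object. A minimal Apex sketch of what such an assignment amounts to (the Support_Team permission set name is hypothetical):

// Assign a permission set to the running user; delegated admins do
// the same thing through the Setup UI for their delegated groups.
PermissionSet ps = [
    SELECT Id FROM PermissionSet WHERE Name = 'Support_Team' LIMIT 1
];
insert new PermissionSetAssignment(
    AssigneeId = UserInfo.getUserId(),
    PermissionSetId = ps.Id
);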
❌ Why the other options are incorrect:
B. Configure updates to sharing rules ❌
Sharing rules are part of security and access control, which delegated admins cannot modify.
C. Manage custom objects and customize nearly every aspect ❌
Delegated admins have limited customization rights. They cannot manage custom objects or make broad system changes.
🔗 Reference:
Salesforce Help: Delegated Administration Overview
Salesforce Trailhead: User Management
A developer is getting errors for a production deployment. The test deployment in the Full
sandbox, which included a local test run, was successful. The Full sandbox was last
refreshed 2 weeks ago.
Where should the administrator check to see what was recently changed?
A. Salesforce Optimizer
B. Dev Console
C. Field History
D. Setup Audit Trail
Explanation:
Why D is correct:
Setup Audit Trail lists recent configuration and metadata changes (who changed what, and when) across the org: things like fields, validation rules, flows, permission sets, page layouts, and Apex classes. Since production is failing but the Full sandbox (refreshed 2 weeks ago) deployed fine, the Audit Trail helps you spot recent production-only changes introduced after the sandbox refresh that could be breaking the deployment (e.g., a renamed field, a deleted dependency, or a modified validation rule or flow).
Why the others are incorrect:
A. Salesforce Optimizer – Provides recommendations and usage insights, not a chronological log of who changed what that would explain a new deployment failure.
B. Dev Console – Useful for editing/testing code, logs, and queries, but it doesn’t give you a consolidated history of setup changes made in production.
C. Field History – Tracks data changes on records (e.g., Account.Name changed), not metadata/config changes that impact deployments.
Tip:
After checking Audit Trail in Production, compare with the sandbox to identify deltas since the last refresh; fix or align metadata, then redeploy.
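The Setup Audit Trail is also exposed as the queryable SetupAuditTrail object, so the delta since the refresh can be pulled with SOQL. A minimal sketch:

// List setup changes from the last 14 days (the window since the
// sandbox refresh in this scenario), newest first.
List<SetupAuditTrail> changes = [
    SELECT Action, Section, Display, CreatedBy.Name, CreatedDate
    FROM SetupAuditTrail
    WHERE CreatedDate = LAST_N_DAYS:14
    ORDER BY CreatedDate DESC
];
for (SetupAuditTrail change : changes) {
    System.debug(change.CreatedDate + ' ' + change.CreatedBy.Name +
                 ': ' + change.Display);
}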
Cloud Kicks (CK) completed a project in a sandbox environment and wants to migrate the changes to production. CK split the deployment into two distinct change sets. Change set 1
has new custom objects and fields. Change set 2 has updated profiles and automation.
What should the administrator consider before deploying the change sets?
A. The Field-Level Security will not be deployed with the profiles in change set 2.
B. Change set 2 needs to be deployed first.
C. Automations need to be deployed in the same change set in order to be activated.
D. Both change sets must be deployed simultaneously.
Explanation:
Why A is correct:
When deploying changes using change sets, the Field-Level Security (FLS) for new fields is not automatically included with the profiles. You must explicitly include the profiles in the same change set as the new objects and fields in order to deploy the FLS settings for those new fields. Change set 2, which contains the profiles, will not have the context of the new custom objects and fields from change set 1. Therefore, when change set 2 is deployed, the profile updates will not include the FLS for the new fields, potentially leaving the new fields invisible or inaccessible to users. This is a common pitfall of splitting deployments in this manner.
Why other options are incorrect
B. Change set 2 needs to be deployed first:
This is incorrect. Deploying change set 2 (with profiles and automation) before change set 1 (with custom objects and fields) would cause the deployment to fail. The profiles and automation in change set 2 are dependent on the custom objects and fields in change set 1. Dependent components must exist in the target org or be included in the same change set for a successful deployment.
C. Automations need to be deployed in the same change set in order to be activated:
This is not always true. While it's best practice to keep related components together, an automation (like a flow) can be deployed in a separate change set from its dependencies as long as the dependencies are already in the target org. Furthermore, a flow is deployed as inactive and must be manually activated after deployment, regardless of whether it's in the same change set as other components.
D. Both change sets must be deployed simultaneously:
This is not a technical requirement and is generally not possible with Salesforce change sets. They must be deployed sequentially. The proper deployment order is to deploy the change set with the base components (like objects and fields) first, followed by dependent components (like profiles and automation).
Study Tips
Visualize Dependency Chains: Draw a diagram of your deployment components and their dependencies. In this case, profiles and automation in change set 2 depend on the objects and fields in change set 1. This visualization will help you understand the correct deployment order.
Practice with Change Sets: The best way to understand the quirks of change set deployments is to practice them in a sandbox. Create a few custom fields and a new profile, and try deploying them in different change set combinations to see how the FLS settings behave.
Know the Rules for Profiles: Remember that profiles in change sets only deploy permissions for the components that are also included in that specific change set. This is a critical point that can lead to unexpected permission problems after deployment.
Bottom Line
To successfully deploy metadata with dependencies using change sets, an administrator must ensure that all components are deployed in the correct order, with core components (custom objects and fields) being deployed before or together with dependent components (profiles, automation). The FLS for new fields must be explicitly included in the same change set as the fields themselves by including the relevant profiles.
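After deploying, one way to verify whether the FLS actually made it across is to query the FieldPermissions object; a sketch with hypothetical API names (Region__c.Zone__c):

// Which profiles and permission sets can read or edit the new field?
// An empty result for a given profile means its FLS was not deployed.
List<FieldPermissions> fls = [
    SELECT Parent.Profile.Name, Field, PermissionsRead, PermissionsEdit
    FROM FieldPermissions
    WHERE SobjectType = 'Region__c'
      AND Field = 'Region__c.Zone__c'
];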
Cloud Kicks has an export of Order and Order Item data from an enterprise resource
planning (ERP) system. The data must be imported into the Salesforce Order and Order
Product objects, while maintaining the relationships in the data.
What are two ways the administrator should load the data?
Choose 2 answers
A. Use an Upsert operation to load data.
B. Use an Insert operation to load data.
C. Replace the Salesforce record ID with the External ID.
D. Map an External ID data value to the object.
Explanation:
This scenario requires a two-step import process that uses External IDs to maintain the parent-child relationship between Order (Parent) and Order Product (Child).
Correct Answer 1: B. Use an Insert operation to load data.
Since the data is being loaded for the first time from an external ERP system, the appropriate operation is Insert. The administrator must follow this sequence:
Insert the Parent Records: Use the Insert operation to load all the Order records first, ensuring the ERP's unique ID is mapped to the Salesforce External ID field (as described in Option D).
Insert the Child Records: Use the Insert operation again to load the Order Product records, utilizing the External ID mapping (from the child record's source data to the parent's lookup field) to link them to the correct Order records.
Correct Answer 2: D. Map an External ID data value to the object.
To establish the link between the ERP system and Salesforce, the administrator must first ensure the Order object has a custom field marked as an External ID (e.g., ERP_Order_ID__c).
Preparation: Create this custom External ID field on the Order object.
Import: When importing the Order Product (child) records, the administrator must map the unique ERP Order ID from the source file to the Order Lookup Field on the Order Product object.
Relationship: Salesforce uses the value in the ERP Order ID field on the child record to search the Order object's External ID field, finds the newly created Salesforce Order record, and automatically establishes the lookup relationship.
Why the Other Options are Incorrect
A. Use an Upsert operation to load data: Upsert is used to update existing records or insert new ones based on an External ID match. While it uses External IDs, Insert is the required operation for a first-time load of brand new data. Option B is a more fundamental step in this import.
C. Replace the Salesforce record ID with the External ID: The Salesforce Record ID is a unique, system-generated, immutable identifier. It cannot be replaced or overwritten. The administrator must map the external ID to a new custom field marked as an External ID, not the standard Salesforce ID.
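The same external ID pattern is available in Apex DML, where a parent reference can be set by external ID value instead of record ID. A minimal sketch, assuming a custom external ID field ERP_Order_ID__c on Order (other required Order Product fields, such as the price book entry, are omitted for brevity):

// Salesforce resolves ERP_Order_ID__c to the parent Order's record
// ID at DML time, so the ERP export needs no Salesforce IDs at all.
OrderItem line = new OrderItem(Quantity = 2, UnitPrice = 100);
line.Order = new Order(ERP_Order_ID__c = 'ERP-0042');
insert line; // fails if no Order carries ERP_Order_ID__c = 'ERP-0042'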
Which three fields should be used as filter criteria? Choose 3 answers
A. A phone field that provides the full phone number of the seller.
B. A multi-select picklist field that designates features of the listing.
C. A number field that designates the square footage of the listing.
D. A formula field that calculates a price for the listing.
E. A picklist field that designates the county of the listing.
Explanation:
When choosing fields for filter criteria in reports, list views, or automation, you want fields that are:
Structured and searchable
Consistently populated
Useful for segmenting or narrowing down data
Here’s why the selected fields work well:
B. Multi-select picklist field (features of the listing)
Allows filtering based on one or more selected features (e.g., pool, garage, garden).
Useful for narrowing listings by amenities or characteristics.
C. Number field (square footage)
Numeric fields are ideal for range-based filters (e.g., greater than 1000 sq ft).
Helps users find listings that meet size requirements.
E. Picklist field (county of the listing)
Picklists offer standardized values, making them perfect for filtering by location.
Ensures consistent segmentation across geographic areas.
❌ Why the other options are less suitable:
A. Phone field (seller’s phone number)
Not useful for filtering — typically unique per record and not meaningful for segmentation.
D. Formula field (calculated price)
Formula fields can be used in filters, but they may not be as reliable or performant depending on complexity and dependencies.
Also, filtering by calculated values can be less intuitive than using base fields.
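The same filterability carries over to SOQL; a sketch against a hypothetical Listing__c object shows how each recommended field type maps to a clean predicate:

// Number, picklist, and multi-select picklist fields all filter well.
List<Listing__c> matches = [
    SELECT Id, Name
    FROM Listing__c
    WHERE Square_Footage__c >= 1000        // number: range filter
      AND County__c = 'Summit'             // picklist: exact match
      AND Features__c INCLUDES ('Pool')    // multi-select picklist
];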
The administrator at Cloud Kicks created a flow in a sandbox that walks service agents
through the Return Merchandise Authorization creation process. The administrator
deployed the flow to production with a Change Set. Users are unable to use the flow in
production.
Which step should the administrator take?
A. Activate the flow manually after deployment.
B. Include the active and prior inactive flow version in the Change Set.
C. Ensure there is an active flow version in the sandbox.
D. Deploy the flow with the Metadata API instead of Change Sets.
Explanation:
Why A is correct:
This is the standard and required procedure. When a flow is deployed to production via a change set, it is deployed in an inactive state, regardless of its activation status in the source sandbox. This is a safety feature to prevent an untested or misconfigured flow from automatically running in the production environment. The administrator must manually log into production, navigate to the Flow setup menu, and activate the flow before users can execute it.
Why B is incorrect:
Including prior versions in the change set is unnecessary and does not solve the problem. The activation status is not carried over via the change set. Only the flow's definition and its versions are deployed; the "active" flag is always reset to false upon deployment.
Why C is incorrect:
While it is a best practice to test and activate a flow in the sandbox to ensure it works, this has no bearing on its status after deployment. Even if the flow is active in the sandbox, it will still be deployed as inactive to production. The root cause of the issue is the post-deployment activation step in production, not the sandbox's state.
Why D is incorrect:
The tool used for deployment is not the issue. Whether you use a Change Set, the Metadata API, DevOps Center, or any other tool, the behavior is the same: flows are deployed in an inactive state. Switching tools will not solve this problem.
Reference:
Salesforce Help: Deploy Flows - This document explicitly states: "When you deploy an active flow to another org, the flow is inactive after deployment. Activate the flow in the new org after you verify that it works as expected."
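A quick way to confirm the activation status in production is to query the FlowDefinitionView object; a sketch with a hypothetical flow API name:

// ActiveVersionId stays null until someone activates the flow
// after deployment.
List<FlowDefinitionView> defs = [
    SELECT ApiName, Label, ActiveVersionId, LatestVersionId
    FROM FlowDefinitionView
    WHERE ApiName = 'RMA_Creation_Process'
];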
Ursa Major Solar allows its scientists to log new stars as they find them, but on occasion,
they log the same star by mistake. The administrator wants scientists to be notified when a
record is deleted and by whom, and to maintain their own discovery information.
What automation solution should be used to send the notification?
A. Heroku
B. Process Builder
C. Workflow Action
D. Flow
Explanation:
To notify scientists when a record is deleted and track who deleted it, the best automation tool is Flow, specifically a Record-Triggered Flow configured for Delete events.
Salesforce Flows allow you to:
Trigger automation when a record is deleted — something that Workflow Rules and Process Builder cannot do.
Capture the user who performed the deletion using the $User global variable.
Send custom notifications or emails to the relevant scientist.
Preserve or transfer discovery information before deletion, if needed.
❌ Why the other options are incorrect:
A. Heroku
Heroku is a cloud platform for building apps, not a native Salesforce automation tool. It’s not necessary for this use case.
B. Process Builder
Process Builder does not support delete triggers, so it cannot respond to record deletions.
C. Workflow Action
Workflow Rules are limited to create and update events. They cannot trigger on deletions or send notifications based on who deleted a record.
🔗 Reference:
Salesforce Help: Record-Triggered Flows
Salesforce Trailhead: Automate with Flow
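For contrast, the same delete event is also visible to Apex; a hypothetical trigger sketch (Star__c is an assumed object) shows the programmatic equivalent, though the declarative Flow remains the correct exam answer:

// Code alternative for context only: a before delete trigger can
// also capture which user is deleting the record.
trigger StarDeleteAudit on Star__c (before delete) {
    for (Star__c s : Trigger.old) {
        System.debug('Star deleted: ' + s.Name +
                     ' by user ' + UserInfo.getUserId());
    }
}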
An administrator is planning the release process for the year. The team will be using
change sets to deploy to production.
Which three best practices should be considered?
A. Plan your deployments around the production and sandbox maintenance schedules.
B. Use matching names for global publisher layouts and Outlook publisher layouts.
C. Be sure to test the data after deployment only after business hours.
D. Make sure to deploy all dependent components.
E. Make sure change sets are limited to 10,000 files.
Explanation:
A. Plan your deployments around the production and sandbox maintenance schedules.
This is a critical best practice. Salesforce has scheduled maintenance and release windows for both sandboxes and production environments.
Sandbox Refreshes: You should not plan a deployment right before a sandbox refresh, as any work you are depending on will be lost.
Production Maintenance: Deployments during scheduled maintenance could fail, be delayed, or conflict with system upgrades. Planning around these windows ensures a smooth deployment process.
D. Make sure to deploy all dependent components.
A change set requires all components it relies on to be included, unless those components already exist in the target (Production) environment in the required state.
If you deploy a custom field but forget the validation rule that references it, the deployment will likely fail validation.
If you deploy an Apex class but forget the custom object it uses, the deployment will fail. Always check the View/Add Dependencies button on the change set detail page to identify and include necessary components.
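As a concrete example of such a dependency, an Apex class that references a custom object can only deploy if that object already exists in the target org or travels in the same change set (Region__c is a hypothetical example):

// This class fails to compile in production unless Region__c is
// already there or is included in the same change set.
public with sharing class RegionService {
    public static Integer countRegions() {
        return [SELECT COUNT() FROM Region__c];
    }
}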
E. Make sure change sets are limited to 10,000 files.
While the technical limit for a single change set is 10,000 files, deployments of this size are very difficult to manage, validate, and troubleshoot.
The best practice is to keep change sets small and focused (e.g., related to a single project or user story).
Splitting large releases into multiple, sequential change sets is recommended for stability and faster deployment times.
Why the Other Options are Incorrect
B. Use matching names for global publisher layouts and Outlook publisher layouts:
This is a naming convention best practice for consistency, but it is not a best practice that affects the technical success or risk management of the deployment process itself.
C. Be sure to test only after business hours the data after deployment:
Testing is crucial, but it should be done before deployment in a sandbox and immediately after deployment in production, regardless of business hours. Testing the data (functional testing) must be a planned, thorough step, not just limited by time of day.
Cloud Kicks (CK) has introduced its new Alpha Shoe line. Customers create cases from
CK's website. Managers receive a report of all cases created last week. Managers would
like a way to easily see in the report if the customer refers to the new shoe line in the case
subject.
How should the system administrator modify the report to meet this request?
A. Add a cross-filter and a with' sub-filter.
B. Build a row-level formula.
C. Change the format to a joined report.
D. Include a contains filter on Subject.
Explanation:
Why B is correct
A row-level formula can add a column like Mentions Alpha Shoe that returns Yes/No (or TRUE/FALSE) by checking whether Subject contains the phrase (e.g., IF(CONTAINS(Subject, "Alpha Shoe"), "Yes", "No")). This keeps all cases in the report (last week) while letting managers quickly see which ones mention the new line.
Why the others are incorrect
A. Add a cross-filter and a ‘with’ sub-filter – Cross-filters filter related records; they don’t inspect text within a field like Subject, nor add a per-row flag.
C. Change the format to a joined report – Joined reports combine multiple report blocks; they don’t solve the need to flag text matches in a field.
D. Include a contains filter on Subject – That would filter out cases that don’t mention Alpha Shoe, but managers want to see all cases and simply highlight which ones do.
AW Computing has a new requirement from its security team: audit information
relating to an account must be recorded in a new custom object called Audit. Audit records
need to be preserved for 10 years and be accessible only by the audit team.
What relationship should be used to relate the Audit object to the Account object?
A. Master-Detail
B. Lookup
C. Many-To-Many
D. Self
Explanation:
Why B is correct: A Lookup Relationship is the appropriate choice here.
Preservation of Records:
The key requirement is that "Audit records need to be preserved for 10 years." In a Master-Detail relationship, if the parent (master) record is deleted, all child (detail) records are also automatically and irreversibly deleted. An Account record might legitimately be merged or deleted for business reasons, but the associated Audit records must be preserved for a decade. A Lookup relationship allows this, as the child (Audit) records remain even if the parent (Account) is deleted (they become orphaned, but they are preserved).
Independent Security:
The requirement states audit records are "only accessible by the audit team." A Lookup relationship allows the Audit object to have its own independent sharing model, page layouts, and profiles/permissions. This makes it easier to restrict access solely to the audit team without affecting the security model of the Account object.
Why A is incorrect:
A Master-Detail Relationship would violate the core requirement. If an Account (the master) were ever deleted, all its associated Audit (the detail) records would be automatically deleted, destroying the 10-year audit trail.
Why C is incorrect:
A Many-to-Many relationship is implemented using a junction object. This is used when one Account can be linked to many Audits and one Audit needs to be linked to many Accounts. This is an unnecessary complexity for this scenario, which is a standard one-to-many relationship (one Account, many Audit records). Furthermore, the junction object would itself be a detail in two master-detail relationships, inheriting the same record deletion problem.
Why D is incorrect:
A Self-Relationship is a lookup relationship from an object back to itself (e.g., relating an Account to a Parent Account). It is completely irrelevant for relating an Audit object to an Account object.
Reference:
Salesforce Help: Relationship Considerations - This document explains the critical difference: "In a master-detail relationship, the detail record doesn’t exist as a standalone record—it’s strongly tied to its master. When a master record is deleted, all its detail records are deleted as well."
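The practical consequence of the lookup choice can be seen in a query like this sketch (Audit__c and its Account__c lookup field are assumed API names):

// With a lookup, Audit__c rows outlive the Account: if the parent is
// deleted, the lookup is simply cleared and the 10-year trail remains.
List<Audit__c> orphaned = [
    SELECT Id, Name, CreatedDate
    FROM Audit__c
    WHERE Account__c = null
];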