A sales manager at AW Computing has created a contact record but is missing some of the information to complete the record. The organization-wide default for Accounts is set to Public Read Only, and Contacts are controlled by parent.
A. Who will be able to edit this new contact record?
B. Users above the sales manager in the role hierarchy
C. All users in the organization
D. The owner and users below the owner in the role hierarchy
E. Sales manager and system administrator
Explanation:
This question tests the understanding of a nuanced but critical aspect of Salesforce sharing: the interaction between the Organization-Wide Defaults (OWD) and record ownership. The key to solving this is to analyze the sharing rules layer by layer.
Let's break down the configuration:
Organization-Wide Default (OWD) for Contacts: "Controlled by Parent"
This is the most restrictive setting for Contacts. It means that a user's access to a Contact record is entirely determined by their access to the parent Account to which the Contact is related.
Since the Contact is new and the sales manager is missing information, it's highly likely this Contact is not related to an Account yet. A Contact without a parent Account is treated as a "private" record, accessible only by the owner, administrators, and users above the owner in the role hierarchy (if hierarchy is enabled for Contacts).
Organization-Wide Default (OWD) for Accounts: "Public Read Only"
This setting is largely irrelevant for this specific scenario because the Contact's access is "Controlled by Parent." The parent Account's OWD would only come into play if the Contact were attached to an Account.
Record Ownership
The sales manager created the record, so they are the owner. The owner of a record always has at least Read/Edit access to that record, unless a restriction like a Validation Rule prevents it.
Applying the Logic to the Scenario
Since the new Contact record has no parent Account, the "Controlled by Parent" rule cannot grant access to anyone based on Account access. Therefore, the record falls back to standard ownership-based access.
The Sales Manager (The Owner): As the record owner, they automatically have Full Access (Read/Edit/Delete/Share) to the record. They can edit it.
The System Administrator: A user with the "Modify All Data" permission, or "Modify All" on the Contact object (standard for the System Administrator profile), can read and edit all records in the org, regardless of OWD, ownership, or sharing rules. They can edit it.
Users above the sales manager in the role hierarchy: The OWD for Contacts must have "Grant Access Using Hierarchies" enabled for users above the owner to gain access. The question does not state this is enabled. We must assume the default, which is that hierarchy is enabled for standard objects like Contacts. Let's verify this.
Official Clarification on Role Hierarchy for Contacts:
According to Salesforce documentation, "Grant Access Using Hierarchies" is enabled by default for standard objects and cannot be disabled for them. It is only for custom objects that you can disable it.
Therefore, because the role hierarchy is enabled for Contacts, the sales manager's manager (and anyone above them) would also have the same level of access as the sales manager. This means users above the sales manager in the role hierarchy would also be able to edit the record.
This presents a conflict, as both option B and option E seem plausible. However, the question asks for the most precise and definitive answer based on standard exam knowledge.
Why E is the Best and Most Defensible Answer:
While users above in the hierarchy would have access, the question's available answers force a choice. Option B ("Users above the sales manager...") excludes the owner and system administrator, which is incorrect because they definitely have access. Option E ("Sales manager and system administrator") is always true, regardless of the role hierarchy setting.
The owner always has access.
The system administrator always has access.
The exam often prioritizes the most certain, permission-based truth over the hierarchy-based one, especially when the hierarchy's status isn't explicitly modified in the scenario. Therefore, the safest and most correct choice is the one that is guaranteed without any assumptions: the owner and the system administrator.
Why the Other Options are Incorrect
A. Who will be able to edit this new contact record? This is not a valid answer choice; it's a repetition of the question stem.
B. Users above the sales manager in the role hierarchy: This is likely technically correct due to the default hierarchy setting for Contacts, but it is incomplete because it excludes the owner and the system administrator, who unequivocally have access.
C. All users in the organization: This is incorrect. The OWD for Contacts is "Controlled by Parent," which is the most restrictive setting. Without a parent Account, access is limited. "Public Read Only" on Accounts does not grant universal access to Contacts.
D. The owner and users below the owner in the role hierarchy: This is incorrect. The role hierarchy grants access downwards, not upwards. Users below the owner in the hierarchy do not automatically get access to the owner's records.
References
Salesforce Help: Organization-Wide Sharing Defaults
Relevance: Explains the "Controlled by Parent" setting.
Key Quote: "Controlled by Parent: Access to a contact or case is based on the sharing settings of its related account. If you don't want to maintain sharing for contacts or cases separately from accounts, choose this setting. Users can't manually share contacts or cases."
Salesforce Help: Grant Access Using Hierarchies
Relevance: Confirms that hierarchy is enabled by default for standard objects.
Key Quote: "The Grant Access Using Hierarchies option is selected by default for all standard objects... For standard objects, you can't disable the Grant Access Using Hierarchies option."
Salesforce Help: Record Ownership
Relevance: Establishes that the record owner always has access.
Key Concept: A record owner inherently has Full Access to their own records, which includes the ability to edit them. This is a foundational principle of the sharing model.
Cloud Kicks has two record-triggered flows on the same object. One flow creates a child record when criteria are met. The second record-triggered flow checks, based on criteria, whether the child record exists and updates a field. The field on the child record that needs to be updated is still null after the second record-triggered flow runs.
What should the administrator do to resolve this issue?
A. Make a new record-triggered flow on the child object to update the field on the parent record.
B. Have the record-triggered flows fire on create or edit to update the field.
C. Combine the two flows into one with checks to see which part of the flow needs to be run.
D. Combine the flows into scheduled flows and have them update the field.
Explanation:
Execution Order is Not Guaranteed: When two or more after-save record-triggered flows are configured on the same object, Salesforce does not guarantee the order of their execution unless a trigger order is explicitly set.
Race Condition: In this scenario, a "race condition" is occurring. The first flow creates the child record. However, since the order is not guaranteed, the second flow, which is supposed to check for the child record and update it, might execute before the child record is fully created and committed to the database. When the second flow runs, it doesn't find the child record because it doesn't exist yet, so the update fails.
Single, Combined Flow: The most robust solution is to consolidate the two flows into a single after-save record-triggered flow. This eliminates the race condition and ensures the actions occur in the correct sequence. The combined flow would:
Have a decision element to check the initial criteria.
If the criteria are met, create the child record.
Immediately after creating the child record, perform the update on that new record.
Efficiency: Combining flows also aligns with the "one flow per object" best practice, which improves performance and makes future maintenance easier.
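To make the combined, single-transaction logic concrete, here is a rough Apex equivalent of what the one consolidated flow does. This is an illustration only (the recommended fix is still a single record-triggered flow), and the object and field API names (Parent__c, Child__c, Status__c, Tracking_Field__c) are assumptions; here the field is simply populated as the child is created, collapsing the create-then-update steps into one operation.

```apex
// Illustrative sketch only: all object/field API names are assumptions.
trigger CreateAndPopulateChild on Parent__c (after insert, after update) {
    List<Child__c> children = new List<Child__c>();
    for (Parent__c parent : Trigger.new) {
        // Decision element: only act when the entry criteria are met
        if (parent.Status__c == 'Qualified') {
            children.add(new Child__c(
                Parent__c         = parent.Id,          // create the child record
                Tracking_Field__c = 'Set on creation'   // populate the field in the same transaction
            ));
        }
    }
    // One operation in one transaction, so no second automation races against the first
    insert children;
}
```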
Why the other options are incorrect:
A. Make a new record-triggered flow on the child object to update the field on the parent record:
This reverses the logic and adds complexity. While a flow on the child object could update the parent, it doesn't solve the core issue of the parent object's original flows and the race condition. The child record is created, but a subsequent flow on the parent object is not the correct way to handle a race condition originating from two separate flows on the same parent object.
B. Have the record-triggered flows fire on create or edit to update the field:
The scenario implies the flows are already triggered on a create or update event. The problem is not with the trigger event itself, but with the timing and sequence of execution when multiple after-save flows are present on the same object. This action will not resolve the underlying race condition.
D. Combine the flows into scheduled flows and have them update the field:
A scheduled flow runs at a specified time and is used for batch jobs or time-based actions. Using scheduled flows would introduce a delay, meaning the field would not be updated immediately. The original requirement is to update the field in response to a record-triggered event, so this is an inefficient and incorrect solution.
An administrator at AW Computing noticed that a custom field on the Contact object was
changed from text to text area.
What tool should the administrator use to investigate this change?
A. Developer Console
B. Field History Tracking
C. Debug Log
D. View Setup Audit Trail
Explanation:
The Setup Audit Trail is the primary tool for investigating changes to Salesforce setup and metadata, such as modifying a custom field's data type (e.g., from Text to Text Area). It logs detailed actions performed by users in Setup, including who made the change, when it occurred, and what was altered. This provides an auditable history of configuration changes without requiring additional setup.
Why not A (Developer Console)?
The Developer Console is used for writing, testing, and debugging Apex code and executing anonymous scripts—it's not designed for auditing metadata or setup changes.
Why not B (Field History Tracking)?
Field History Tracking monitors changes to record-level data (e.g., values in fields on Contact records) but does not track metadata changes like altering a field's type or properties.
Why not C (Debug Log)?
Debug Logs capture runtime details for code execution (e.g., Apex triggers or flows) to troubleshoot errors, but they don't log setup or metadata modifications.
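If the administrator also wants to review the same audit information programmatically, the SetupAuditTrail object can be queried with SOQL. A minimal sketch in anonymous Apex (the 50-row limit is an arbitrary example):

```apex
// Sketch: list the most recent setup changes, which include custom field modifications.
// SetupAuditTrail is a read-only, queryable standard object.
List<SetupAuditTrail> recentChanges = [
    SELECT Action, Section, Display, CreatedBy.Name, CreatedDate
    FROM SetupAuditTrail
    ORDER BY CreatedDate DESC
    LIMIT 50
];
for (SetupAuditTrail entry : recentChanges) {
    // Display holds the human-readable description of the change
    System.debug(entry.CreatedBy.Name + ': ' + entry.Display);
}
```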
Reference
Salesforce Help: View the Setup Audit Trail
Trailhead module: "Security and Access" in the Admin Beginner Trailmap for context on auditing tools.
Which two tools should an administrator use to require data to be entered in a field and improve data quality on a record in Salesforce?
Choose 2 answers
A. Validation Rules
B. Dashboards
C. Workflow Rules
D. Page Layouts
Explanation:
The two tools an administrator should use to require data to be entered in a field and improve data quality are:
A. Validation Rules
Required Data: Validation Rules can enforce conditional requiredness. For example, a rule can be set to require the "Reason Lost" field to be populated only if the Opportunity Stage is set to "Closed Lost." This is a powerful, dynamic way to ensure necessary data is collected at the right time.
Data Quality: Validation Rules ensure that the format and content of the data are correct. They can verify that a field contains only numeric data, that an email address is in a correct format, or that a date falls within an acceptable range, which directly improves data quality.
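As a concrete sketch of the conditional requirement described above, the validation rule's error condition formula might look like the following, assuming a custom text field named Reason_Lost__c on the Opportunity:

```
AND(
    ISPICKVAL(StageName, "Closed Lost"),
    ISBLANK(Reason_Lost__c)
)
```

When the formula evaluates to true, the record cannot be saved and the user sees the rule's error message, so the field is effectively required only for Closed Lost opportunities.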
D. Page Layouts
Required Data: Page Layouts allow an administrator to mark a field as required with a simple checkbox (in addition to the universally required option on the field definition itself). This is the primary and easiest way to ensure a field is always populated when a user creates or edits a record on that layout.
Data Quality: Page Layouts improve data quality by controlling visibility and relevance. By making fields required, they ensure data completeness. By hiding unnecessary fields or grouping relevant ones, they guide the user's data entry process, minimizing confusion and errors.
Why Other Options Are Incorrect
B. Dashboards:
Dashboards are a reporting and visualization tool used to analyze data quality (e.g., show reports of records missing key fields) but they do not enforce data entry requirements or improve data quality at the point of entry.
C. Workflow Rules:
Workflow Rules are an older automation tool for actions (field updates, emails, tasks) based on criteria, but they cannot be used to make a field required or prevent a record from being saved, which is necessary to enforce data entry requirements. The modern equivalent, Flow, can be used for validation, but Workflow Rules themselves cannot.
Ursa Major Solar (UMS) wants to improve its customers' ability to search for knowledge
articles. UMS has already created categories for articles.
Which two additional changes should be made to improve search capabilities?
Choose 2 answers
A. Configure Global Search for specific search terms.
B. Create synonyms for specific search terms.
C. Configure Einstein Search for specific search terms.
D. Promote specific search terms for specific articles.
Explanation:
Since categories are already set up, the next logical steps are to refine how the search engine interprets user queries and prioritizes results. The two best tools for this are:
B. Create synonyms for specific search terms:
Users often search for the same concept using different words (e.g., "internet," "wi-fi," "broadband"). By creating synonyms, you can ensure that a search for any of these terms will return relevant articles, even if the article itself only uses one of the words. This dramatically improves the recall and user-friendliness of the search.
D. Promote specific search terms for specific articles:
This feature, often called "Promoted Results" in the context of Knowledge or Einstein Search, allows an administrator to manually curate search results. When a user enters a specific promoted term, a designated, highly relevant article is forced to the top of the search results. This is perfect for ensuring users find critical articles for common queries.
Let's examine why the other options are incorrect:
A. Configure Global Search for specific search terms:
Global Search is configured by selecting which objects and fields are searched. It is not a tool for fine-tuning the semantics or result ranking for a specific domain like Knowledge. Its configuration is too broad for this specific use case.
C. Configure Einstein Search for specific search terms:
Einstein Search is a powerful, AI-driven tool, but it is largely automated. Administrators enable and configure its overall behavior (like result ranking or field weighting), but they do not "configure it for specific search terms" in the same direct, manual way as creating synonyms or promoted terms. The configuration is more about tuning the algorithm, not defining specific term-to-article relationships.
Reference:
Salesforce Help: "Define Search Synonyms" - Details how to create synonyms to make search more flexible.
Salesforce Help: "Promote Search Terms in Knowledge" - Explains how to boost specific articles to the top of the results for key queries. (Note: The exact terminology may be "Promoted Results" in your Salesforce edition).
Cloud Kicks uses a Lightning web component to provide instructions to sales reps. An
administrator needs to correct a spelling error in the displayed text in one of the Lightning
web components.
What is the recommended tool to make the change?
A. Developer Org
B. Visual Studio Code
C. Salesforce Lightning Inspector
D. Developer Console
Explanation:
Visual Studio Code (VS Code) is the recommended tool for editing Lightning Web Components (LWCs). Salesforce developers and administrators use VS Code along with the Salesforce Extension Pack to:
Access and modify component files (HTML, JavaScript, CSS)
Correct spelling errors or update instructional text
Deploy changes back to the org using Salesforce CLI
In this scenario, the administrator needs to fix a spelling error in the displayed text—which likely resides in the HTML template of the LWC. VS Code provides a full development environment to locate and edit that file efficiently.
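For context, the displayed text usually lives in the component's HTML template file, which is exactly what the administrator would open and correct in VS Code before deploying. A purely hypothetical example (the component name, path, and text are assumptions):

```html
<!-- force-app/main/default/lwc/salesInstructions/salesInstructions.html (hypothetical) -->
<template>
    <lightning-card title="Sales Rep Instructions">
        <!-- The misspelled word would be corrected directly in this markup -->
        <p class="slds-p-around_medium">
            Remember to confirm the customer's shipping address before closing the opportunity.
        </p>
    </lightning-card>
</template>
```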
❌ Why the Other Options Are Incorrect
A. Developer Org
A Developer Org is an environment, not a tool. You can deploy and test changes there, but you don’t edit code directly in the org interface.
C. Salesforce Lightning Inspector
This is a Chrome extension used for debugging Lightning components in the browser. It’s not designed for editing component code.
D. Developer Console
The Developer Console is useful for Apex code, SOQL queries, and logs—but it does not support editing Lightning Web Components, which are built using JavaScript and HTML.
Reference:
📘 Salesforce Help: Set Up Visual Studio Code
AW Computing is running a special bundle deal on monitors and keyboards. Normally,
discounts need VP approval, but this special bundle is pre-approved.
What should the administrator recommend for these requirements?
A. Create a separate price book.
B. Implement CPQ.
C. Remove the approval process.
D. Enable Subscriptions
Explanation:
Why this fits best:
The bundle is pre-approved, so reps shouldn’t need to submit a discount for approval when they sell that specific offer. The clean Salesforce way to do this—without dismantling your current discount governance—is to publish the bundle at the approved promotional price and make it available through a separate price book (e.g., “Promo – Q4 Bundles”).
Reps then add the promo product at its list price from that promo price book, so they aren’t “discounting” anything—the price is already approved. Your existing approval process for other discounted deals stays intact. Salesforce’s official docs recommend using custom price books to offer different prices to different segments/uses, which is exactly this scenario.
Why not the others:
B. Implement CPQ — CPQ is powerful for complex bundles, rules, and advanced discounting, but it’s heavy if your need is simply a temporary, pre-approved bundle price. You can meet the requirement with standard Products & Price Books—no CPQ required. (CPQ “bundles” are a CPQ feature, but not needed here.)
C. Remove the approval process — That would remove governance for all discounts, not just this one promotion. Salesforce’s own examples show approvals tied to discount thresholds—keep those for non-promo deals.
D. Enable Subscriptions — Subscriptions relate to recurring products/billing, not one-time promo bundles or discount approvals. It doesn’t address the requirement. (No official guidance connects “subscriptions” to bypassing discount approvals.)
Quick implementation sketch
Create a Product for the Bundle – Monitor + Keyboard (or keep components separate but price the bundle as a single product for simplicity).
Create a custom Price Book called something like Promo – Q4 Bundles. Add the bundle product with the pre-approved price.
Share/assign that price book to the appropriate users/teams; instruct reps to select from this price book when selling the promo.
Keep your discount approval process for any non-promo discounts on other products/opportunities.
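A minimal anonymous Apex sketch of steps 1 and 2 above, assuming the bundle is modeled as a single product and using placeholder names and prices:

```apex
// Sketch only: product name, price book name, and prices are placeholder assumptions.
Product2 bundle = new Product2(Name = 'Monitor + Keyboard Bundle', IsActive = true);
insert bundle;

// A product needs a standard price before it can be added to any custom price book.
Pricebook2 standardPb = [SELECT Id FROM Pricebook2 WHERE IsStandard = true LIMIT 1];
insert new PricebookEntry(
    Pricebook2Id = standardPb.Id, Product2Id = bundle.Id, UnitPrice = 499, IsActive = true);

// The promo price book carries the pre-approved bundle price, so reps never enter a discount.
Pricebook2 promoPb = new Pricebook2(Name = 'Promo – Q4 Bundles', IsActive = true);
insert promoPb;
insert new PricebookEntry(
    Pricebook2Id = promoPb.Id, Product2Id = bundle.Id, UnitPrice = 399, IsActive = true);
```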
Bonus tips (to keep you out of trouble)
Name it clearly. Give the promo price book and product names that make it obvious they’re pre-approved, so reps don’t try to apply additional discounts.
Sunset the promo. When the deal ends, deactivate the price book entry or unshare the promo price book.
Reports & guardrails. Add a simple report on opportunities using the promo price book to track adoption; keep the approval process active for everything else (e.g., “>15% requires manager” as per Salesforce’s own examples).
Bottom line:
Publish the bundle at its approved price via a separate price book, so reps can sell it without triggering discount approvals—while your normal discount approval process continues to govern everything else.
The administrator at Cloud Kicks is troubleshooting an issue one user is having with a flow.
They have decided to add a debug log to that user.
What debug log category should be used?
A. Workflow
B. Callout
C. System
D. Database
Explanation:
Why:
In Salesforce debug logs, the Workflow category is the one that captures automation like workflow rules, flows, and processes. When troubleshooting a user’s Flow issue via a trace flag, set the Workflow category to a finer level (e.g., Fine/Finer/Finest) to see detailed Flow execution lines (FLOW_* events), element entry/exit, and variable assignments.
Quick tip:
When you create the user trace flag, choose or create a Debug Level where Workflow = Finer (or Finest). This yields the most helpful Flow details without having to crank up every other category. You can set this from Setup → Debug Logs.
Why the others aren’t right:
B. Callout – Logs HTTP callouts/external service requests, not Flow logic per se.
C. System – Captures system method calls and System.debug(), but Flow specifics are in the Workflow category.
D. Database – Captures DML/SOQL details; useful context, but not the primary category for Flow steps.
Reference:
Salesforce Help — Debug Log Levels: “Workflow … includes information for workflow rules, flows, and processes.”
Cloud Kicks needs to create 10 separate environments for various projects. A developer
sandbox has been created with the necessary configuration and data. The administrator
needs to create 10 new environments with the same metadata and data for each user.
What should the administrator do to meet the requirements?
A. Use refresh sandbox without Auto Activate.
B. Use the existing sandbox as a sandbox template.
C. Use clone a sandbox option from the existing sandbox.
D. Use a scratch org definition to copy sandbox.
Explanation:
Cloud Kicks needs to create 10 separate environments with the same metadata and data as an existing developer sandbox for various projects. The key is to replicate the configuration and data efficiently across these environments. Here’s why using the existing sandbox as a sandbox template is the best approach and why the other options are less suitable:
B. Use the existing sandbox as a sandbox template:
A sandbox template allows administrators to define which objects and data to include when creating new sandboxes. By creating a sandbox template based on the existing developer sandbox, the administrator can use it to create multiple new sandboxes (e.g., Developer or Developer Pro sandboxes) with the same metadata and selected data. This approach ensures consistency across the 10 environments and is designed specifically for replicating sandbox configurations and data.
How it works:
In Salesforce, go to Setup > Sandboxes > Sandbox Templates, create a template based on the existing developer sandbox, select the objects and data to include, and then use this template to create the 10 new sandboxes. This is efficient and aligns with Salesforce’s standard sandbox management features.
Why not A (Use refresh sandbox without Auto Activate)?
Refreshing a sandbox updates an existing sandbox with the latest metadata and data from the production org, not from another sandbox. This option doesn't allow replicating the existing developer sandbox's specific configuration and data to create 10 new environments. Additionally, "Auto Activate" simply controls whether a refreshed sandbox is activated automatically once the copy completes; it is not relevant to replicating data or metadata across multiple sandboxes.
Why not C (Use clone a sandbox option from the existing sandbox)?
Salesforce does not offer a direct “clone a sandbox” feature. While you can refresh a sandbox or create a new one, cloning is not a standard option in Salesforce’s sandbox management. The closest approach is using a sandbox template (option B) to replicate the configuration and data.
Why not D (Use a scratch org definition to copy sandbox)?
Scratch orgs are temporary, configurable environments used primarily in Salesforce DX for source-driven development. They are not designed to replicate an existing sandbox’s metadata and data directly. Creating a scratch org definition file to mimic a sandbox is complex, time-consuming, and not suited for this use case, as scratch orgs typically start with minimal data and require manual configuration or data loading.
Recommendations for Implementation
Steps:
Go to Setup > Sandboxes > Sandbox Templates in the production org.
Create a new sandbox template, selecting the objects and data from the existing developer sandbox to include in the new environments.
Create 10 new Developer or Developer Pro sandboxes (depending on data storage needs), selecting the sandbox template during creation.
Verify that the metadata (e.g., custom objects, fields, flows) and data are correctly replicated in each new sandbox.
Considerations:
Ensure the sandbox license limits (e.g., number of Developer or Developer Pro sandboxes) are sufficient for creating 10 environments. Check the data storage capacity, as Developer sandboxes have limited data storage (200 MB) compared to Developer Pro (1 GB).
References
Salesforce Help: Create a Sandbox Template (Trailhead module: “Sandbox Environments” in the Admin Advanced Trailmap for sandbox template setup).
Salesforce Help: Create or Refresh a Sandbox (covers using templates for sandbox creation).
Cloud Kicks needs to track government-issued identification numbers for its customers. The security team requires that the identification number cannot be changed by users and must be masked when displayed, except the last two digits.
Which two recommended configurations should the administrator create? Choose 2 answers
A. Use a field with Classic Encryption.
B. Enable Shield Platform Encryption.
C. Configure a Field Encryption Policy
D. Set Read-Only Field-Level Security in the user Profile
Explanation:
This requirement has two distinct security needs that must be met simultaneously: immutability (cannot be changed) and masking (displaying only the last two digits). Only Shield Platform Encryption is designed to meet both requirements natively.
B. Enable Shield Platform Encryption: This is the foundational step. Shield Platform Encryption is a paid Salesforce service that provides encryption of data at rest. It is a prerequisite for using the advanced features needed to mask the data upon display. Classic Encryption is not sufficient for this use case.
C. Configure a Field Encryption Policy: Once Shield Platform Encryption is enabled, you create a Field Encryption Policy for the specific custom field storing the ID number. Within this policy, you can define the Masking Type. You would select a masking format (like "Show last 2 characters") which ensures that anywhere the field is displayed in the Salesforce UI (records, lists, reports), it will appear masked (e.g., ********34), except for users who have the "View Encrypted Data" permission.
Let's examine why the other options are incorrect or insufficient:
A. Use a field with Classic Encryption: Classic Encryption is a legacy, weaker encryption standard that does not support masking. It only encrypts the data at rest but does not control how it is displayed. It also lacks the granular cryptographic controls of Shield Platform Encryption and is not recommended for new implementations, especially for sensitive data like government IDs.
D. Set Read-Only Field-Level Security in the user Profile: While this would successfully make the field immutable (read-only) for users, it does nothing to mask the data. The full identification number would still be visible to anyone with read access to the field. This only solves one part of the requirement and fails the critical security need for data masking.
Reference:
Salesforce Help: "Get Started with Shield Platform Encryption" - Describes the capabilities and setup process for Shield Platform Encryption.
Salesforce Help: "Define Field Encryption Policies" - Explains how to create a policy and specifically how to set a masking format, which is the direct solution for displaying only the last few digits of a value.
Universal Containers has a Private sharing model for Accounts and Opportunities. A new
team is being created from within the sales team that will be assigned all renewal
opportunities. These users will need to see all closed won opportunities while keeping the
account private.
How should the administrator meet this requirement?
A. Update the organization-wide default on Opportunities to Public Read Only and add them to the opportunities team.
B. Create a permission set with View All enabled on Accounts and assign it to the new users.
C. Create a new profile for the renewals team with View All permission enabled on Accounts and Opportunities.
D. Create a public group for the renewals team and create a criteria based sharing rule on Opportunities.
Explanation:
Universal Containers uses a Private sharing model for both Accounts and Opportunities, which means users can only see records they own or that are explicitly shared with them. The new renewals team needs access to all Closed Won Opportunities, but Accounts must remain private.
Here’s how Option D meets the requirements:
Create a public group for the renewals team to manage access collectively.
Use a criteria-based sharing rule on the Opportunity object to share Closed Won records with this group.
This ensures the renewals team can see all Closed Won Opportunities, regardless of ownership.
Because Opportunities are being shared directly, the related Accounts remain private, satisfying the confidentiality requirement.
❌ Why the Other Options Are Incorrect
A. Update the organization-wide default on Opportunities to Public Read Only and add them to the opportunities team
This would expose all Opportunities to everyone, not just Closed Won ones. It violates the requirement to keep access limited and controlled.
B. Create a permission set with View All enabled on Accounts and assign it to the new users
This would give users access to all Accounts, which contradicts the requirement to keep Accounts private.
C. Create a new profile for the renewals team with View All permission enabled on Accounts and Opportunities
Similar to Option B, this grants unrestricted access, including to Accounts, which is not acceptable under the stated privacy model.
Reference:
📘 Salesforce Help: Sharing Rules
Ursa Major Solar has a training sandbox with 160MB of test data that needs to be
refreshed every other day.
Which two sandboxes should be used in this instance?
Choose 2 answers
A. Partial
B. Developer
C. Developer Pro
D. Full
Explanation:
The key requirements are:
Data Requirement: 160MB of test data.
Refresh Cadence: Every other day (a very high frequency).
Let's analyze the sandbox types against these needs:
B. Developer:
A Developer sandbox includes a copy of your production org's metadata (configurations, code, etc.) but no production data; it provides 200 MB of data storage for test data loaded into it. Since 160MB is well under this 200MB limit, a Developer sandbox can meet the data requirement. It also has a 1-day refresh interval, meaning you can refresh it every 24 hours, which satisfies the "every other day" requirement.
C. Developer Pro:
A Developer Pro sandbox has the same 1-day refresh interval as a Developer sandbox and likewise contains metadata only. The difference is that it offers more data storage (1 GB vs. 200 MB). For the specific requirements of 160MB of data and an every-other-day refresh, a Developer Pro sandbox is also perfectly suitable. An organization might choose Developer Pro if it is approaching the 200MB data storage limit of a standard Developer sandbox.
Let's examine why the other options are incorrect:
A. Partial Copy:
A Partial Copy sandbox includes a sample of production data defined by a sandbox template (up to 5 GB of data storage) and has a 5-day refresh interval. While it can easily hold 160MB of data, it cannot be refreshed every other day due to its 5-day minimum wait period between refreshes. This fails the core requirement.
D. Full:
A Full sandbox is a complete copy of your production org, including all data. It has a very long 29-day refresh interval and is a resource-intensive operation. It is completely unsuitable for a small 160MB data requirement and an every-other-day refresh cadence.
Reference:
Salesforce Help: "Sandbox Types and Templates" - This documentation provides a comparison table that clearly lists the Refresh Interval and Data Storage limits for each sandbox type. This table is the definitive source for confirming that only Developer and Developer Pro sandboxes support a 1-day refresh interval.