Platform Data Architect
Northern Trail Outfitters (NTO) has a loyalty program to reward repeat customers. The following conditions exist: 1. Reward levels are earned based on the amount spent during the previous 12 months. 2. The program will track every item a customer has bought and grant them points for discounts. 3. The program generates 100 million records each month. NTO Customer Support would like to see a summary of a customer's recent transactions and the reward level(s) they have attained. Which solution should the data architect use to provide the information within Salesforce for the customer support agents?
- Create a custom object in Salesforce to capture and store all reward programs, populate it nightly from the point-of-sale system, and present it on the customer record.
- Provide a button so that the agent can quickly open the point-of-sale system that displays the customer's history.
- Capture the reward program data in an external data store, and present the 12-month trailing summary in Salesforce using Salesforce Connect and an external object.
- Create a custom big object to capture the reward program data, display it on the contact record, and update it nightly from the point-of-sale system.

Universal Containers (UC) owns a complex Salesforce org with many Apex classes, triggers, and automated processes that will modify records if available. UC has identified that, in its current development state, UC runs the risk of encountering race conditions on the same record. What should a data architect recommend to guarantee that records are not being updated at the same time?
- Refactor or optimize classes and triggers for maximum CPU performance.
- Disable classes or triggers that have the potential to obtain the same record.
- Embed the keywords FOR UPDATE after SOQL statements.
- Migrate programmatic logic to processes and flows.

Universal Containers (UC) is transitioning from Classic to Lightning Experience. What does UC need to do to ensure users have access to its notes and attachments in Lightning Experience?
- Add Notes and Attachments related lists to page layouts in Lightning Experience.
- Manually upload Notes in Lightning Experience.
- Manually upload Attachments in Lightning Experience.
- Migrate Notes and Attachments to Enhanced Notes and Files using a migration tool.

Universal Containers (UC) uses the following Salesforce products: Sales Cloud for customer management, Marketing Cloud for marketing, and Einstein Analytics for business reporting. UC occasionally gets a list of prospects from third-party sources as comma-separated values (CSV) files for marketing purposes. Historically, UC would load these contacts into the Lead object in Salesforce and sync to Marketing Cloud to send marketing communications. The number of records in the Lead object has grown over time and has been consuming large amounts of storage in Sales Cloud. UC is looking for recommendations to reduce the storage and advice on how to optimize the marketing process. What should a data architect recommend to UC in order to immediately avoid storage issues in the future?
- Load the contacts directly to Marketing Cloud and have a reconciliation process to track prospects that are converted to customers.
- Load the CSV files in an external database and sync with Marketing Cloud prior to sending marketing communications.
- Load the CSV files in Einstein Analytics and sync with Marketing Cloud prior to sending marketing communications.
- Continue to use the existing process of using the Lead object to sync with Marketing Cloud, and delete Lead records from Sales Cloud after the sync is complete.

Northern Trail Outfitters (NTO) operates a majority of its business from a central Salesforce org. NTO also owns several secondary orgs that the service, finance, and marketing teams work out of. At the moment, there is no integration between the central and secondary orgs, leading to data-visibility issues. Moving forward, NTO has identified that a hub-and-spoke model is the proper architecture to manage its data, where the central org is the hub and the secondary orgs are the spokes. Which tool should a data architect use to orchestrate data between the hub org and spoke orgs?
- Develop custom APIs to poll the hub org for change data and push it into the spoke orgs.
- A backup and archive solution that extracts and restores data across orgs.
- Develop custom APIs to poll the spoke orgs for change data and push it into the hub org.
- A middleware solution that extracts and distributes data across both the hub and spokes.

During the implementation of Salesforce, a customer has the following requirements for Sales Orders: 1. Sales Order information needs to be shown to users in Salesforce. 2. Sales Orders are maintained in the on-premise enterprise resource planning (ERP) system. 3. Sales Order information has more than 150 million records. 4. Sales Orders will not be updated in Salesforce. What should a data architect recommend for maintaining Sales Orders in Salesforce?
- Use the standard Order object to maintain Sales Orders in Salesforce.
- Use custom big objects to maintain Sales Orders in Salesforce.
- Use external objects to maintain Sales Orders in Salesforce.
- Use custom objects to maintain Sales Orders in Salesforce.

Northern Trail Outfitters (NTO) has been using Salesforce for Sales and Service for 10 years. For the past two years, the marketing group has noticed a rise from 0% to 35% in returned mail when sending mail using the contact information stored in Salesforce. Which solution should the data architect use to reduce the amount of returned mail?
- Use a third-party data source to update the contact information in Salesforce.
- Delete contacts when the mail is returned to save postal costs for NTO.
- Have the sales team call all existing customers and ask them to verify the contact details.
- Email all customers and ask them to verify their information and to call NTO if their address is incorrect.

Northern Trail Outfitters has implemented Salesforce for its sales associates nationwide. Senior management is concerned that the executive dashboards are not reliable for their real-time decision-making. On analysis, the team found the following issues with data entered in Salesforce: 1. Information in certain records is incomplete. 2. Incorrect entries in certain fields cause records to be excluded from report filters. 3. Duplicate entries cause incorrect counts. Which three steps should a data architect recommend to address the issues? Choose 3 answers.
- Leverage Salesforce features, such as validation rules, to avoid incomplete and incorrect records.
- Build a sales data warehouse with purpose-built data marts for dashboards and senior management reporting.
- Design and implement a data-quality dashboard to monitor and act on records that are incomplete or incorrect.
- Explore third-party data providers to enrich and augment information entered in Salesforce.
- Periodically export data for cleansing and import it back into Salesforce for executive reports.
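
An earlier question in this set lists "Embed the keywords FOR UPDATE after SOQL statements" as a way to guarantee that records are not updated concurrently. The following is a minimal Apex sketch of that technique; the class name, method, and the AnnualRevenue adjustment are illustrative placeholders, not part of the question.

```apex
public class AccountBalanceUpdater {
    // FOR UPDATE locks the queried row until this transaction commits or rolls
    // back, so a concurrent transaction that tries to update the same Account
    // waits instead of racing this update.
    public static void applyCredit(Id accountId, Decimal credit) {
        Account acct = [SELECT Id, AnnualRevenue FROM Account WHERE Id = :accountId FOR UPDATE];
        acct.AnnualRevenue = (acct.AnnualRevenue == null ? 0 : acct.AnnualRevenue) + credit;
        update acct;
    }
}
```

The lock is held only until the transaction completes, and a SOQL query that uses FOR UPDATE cannot include an ORDER BY clause.
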

Universal Containers (UC) has adopted Salesforce as its primary sales automation tool. UC has 100,000 customers with a growth rate of 10% a year. UC uses an on-premise, web-based billing and invoice system that generates over 1 million invoices a year supporting a monthly billing cycle. The UC sales team needs to be able to pull up a customer record and view their account status, invoice history, and open opportunities without navigating outside of Salesforce. What should a data architect use to provide the sales team with the required functionality?
- Create a custom object and migrate the last 12 months of invoice data into Salesforce so it can be displayed on the Account layout.
- Create a Visualforce tab with the billing system encapsulated within an iframe.
- Write an Apex callout and populate a related list to display on the account record.
- Create a mashup page that will present the billing system records within Salesforce.

Universal Containers (UC) has implemented Salesforce. UC is running out of storage and needs an archiving solution. UC would like to maintain two years of data in Salesforce and archive older data out of Salesforce. Which solution should a data architect recommend as an archiving solution?
- Use a third-party backup solution to back up all data off platform.
- Build a batch job to move all records off platform, and delete all records from Salesforce.
- Build a batch job to move records older than two years off platform, and delete those records from Salesforce.
- Build a batch job to move all records off platform, and delete old records from Salesforce.

A custom pricing engine for a Salesforce customer is determined by factors with the following hierarchy: 1. State in which the customer is located. 2. City in which the customer is located, if available. 3. Zip code in which the customer is located, if available. 4. Changes to this information should require minimal code changes. What should a data architect recommend to maintain this information for the custom pricing engine that is to be built in Salesforce?
- Maintain the required pricing criteria in custom metadata types.
- Configure the pricing criteria in price books.
- Assign the pricing criteria within the custom pricing engine.
- Create a custom object to maintain the pricing criteria.

Universal Containers has a rollup summary field on Account to calculate the number of contacts associated with an account. During the account load, Salesforce is throwing an "UNABLE_TO_LOCK_ROW" error. Which solution should a data architect recommend to resolve the error?
- Perform a batch job in parallel mode and reduce the batch size.
- Leverage Data Loader's platform API to load data.
- Defer rollup summary field calculation during data migration.
- Perform a batch job in serial mode and reduce the batch size.

Universal Containers is preparing to implement Sales Cloud and would like its users to have read-only access to an Account record if they have access to its child Opportunity record. How would a data architect implement this sharing requirement between objects?
- Implicit sharing will automatically handle this with standard functionality.
- Create an owner-based sharing rule.
- Add appropriate users to the Account team.
- Create a criteria-based sharing rule.

Northern Trail Outfitters (NTO) is in the process of evaluating big objects to store large amounts of asset data from an external system. NTO will need to report on this asset data weekly. Which two native tools should a data architect recommend to achieve this reporting requirement? Choose 2 answers.
- Einstein Analytics.
- Standard SOQL queries.
- Standard reports and dashboards.
- Async SOQL with a custom object.

Northern Trail Outfitters uses Salesforce to manage relationships and track sales opportunities. It has 10 million customers and 100 million opportunities. The CEO has been complaining that a dashboard is taking 10 minutes to run and sometimes fails to load, throwing a time-out error. Which three options should help improve the dashboard performance? Choose 3 answers.
- Remove widgets from the dashboard to reduce the number of graphics loaded.
- Reduce the amount of data queried by archiving unused opportunity records.
- Run the dashboard for the CEO and send it via email.
- Denormalize the data by reducing the number of joins.
- Use selective queries to reduce the amount of data being returned.

A customer operating in a highly regulated industry is planning to implement Salesforce. The customer information maintained in Salesforce includes the following: 1. Personally identifiable information (PII). 2. IP restrictions on profiles organized by geographic location. 3. Financial records that need to be private and accessible only by the assigned sales associate. Enterprise Security has mandated that access be restricted to users within a specific geography, with detailed monitoring of user activity. Additionally, users should not be allowed to export information from Salesforce. Which three Salesforce Shield capabilities should a data architect recommend? Choose 3 answers.
- Restrict access to Salesforce from users outside a specific geography.
- Transaction Security policies to prevent export of Salesforce data.
- Encrypt sensitive customer information maintained in Salesforce.
- Event Monitoring to monitor all user activity.
- Prevent sales users from accessing customer PII information.

Universal Containers has a Sales Cloud implementation for a sales team and an enterprise resource planning (ERP) system as the customer master. Sales teams are complaining about duplicate account records and data quality issues with account data. Which two solutions should a data architect recommend to resolve the complaints? Choose 2 answers.
- Build a nightly batch job to de-dupe data, and merge account records.
- Build a nightly sync job from the ERP to Salesforce.
- Implement a de-dupe solution and establish account ownership in Salesforce.
- Integrate Salesforce with the ERP, and make the ERP the system of truth.

Universal Containers is experiencing frequent and persistent group membership locking issues that severely restrict its ability to manage manual and automated updates at the same time. What should a data architect do in order to resolve the issue?
- Enable implicit sharing.
- Enable granular locking.
- Enable parallel sharing rule calculation.
- Enable defer sharing calculation.

Universal Containers (UC) is in the process of migrating legacy inventory data from an enterprise resource planning (ERP) system into Sales Cloud with the following requirements: 1. Legacy inventory data will be stored in a custom child object called Inventory__c. 2. Inventory data should be related to the standard Account object. 3. The Inventory__c object should inherit the same sharing rules as the Account object. 4. Anytime an Account record is deleted in Salesforce, the related Inventory__c record(s) should be deleted as well. What type of relationship field should a data architect recommend in this scenario?
- Master-detail relationship field on Inventory__c, related to Account.
- Master-detail relationship field on Account, related to Inventory__c.
- Lookup relationship field on Inventory__c, related to Account.
- Indirect lookup relationship field on Account, related to Inventory__c.

Northern Trail Outfitters (NTO) has the following systems: customer master (source of truth for customer information), Service Cloud (customer support), Marketing Cloud (marketing communications), and an enterprise data warehouse (business reporting). The customer data is duplicated across all these systems and is not kept in sync. Customers are also complaining that they get repeated marketing emails and have to call in to update their information. NTO is planning to implement a master data management (MDM) solution across the enterprise. Which three data issues will an MDM tool solve? Choose 3 answers.
- Data standardization.
- Data loss and recovery.
- Data completeness.
- Data duplication.
- Data accuracy and quality.

Universal Containers (UC) has implemented Sales Cloud for its entire sales organization. UC has built a custom object called Projects__c that stores customer project details and employee billable hours. The following requirements exist: 1. A subset of individuals from the finance team will need access to the Projects__c object for reporting and adjusting employee utilization. 2. The finance users will not need access to any sales objects, but they will need to interact with the custom object. Which license type should a data architect recommend for the finance team that best meets the requirements?
- Service Cloud.
- Lightning Platform Starter.
- Sales Cloud.
- Lightning Platform Plus.

Northern Trail Outfitters is migrating to Salesforce from a legacy CRM system that identifies the agent relationships in a lookup table. What should the data architect do in order to migrate the data to Salesforce?
- Migrate the data and assign it to a non-person system user.
- Create a custom object to store agent relationships.
- Assign record owners based on the relationship.
- Migrate to Salesforce without a record owner.

Universal Containers has a requirement to store more than 100 million records in Salesforce and needs to create a custom big object to support this business requirement. Which two tools should a data architect use to build the custom big object? Choose 2 answers.
- Go to Object Manager in Setup and select New to create the big object.
- Go to Big Objects in Setup and select New to create the big object.
- Use DX to create the big object.
- Use the Metadata API to create the big object.

A large automobile company has implemented Salesforce for its sales associates. Leads flow from its website to Salesforce using a batch integration in Salesforce. The batch job converts the leads to Accounts in Salesforce. Customers visiting its retail stores are also created in Salesforce as Accounts. The company has noticed a large number of duplicate Accounts in Salesforce. On analysis, it was found that certain customers could interact with its website and also visit the store. The sales associates use Global Search to search for customers in Salesforce before they create the customers. Which option should a data architect choose to implement to avoid duplicates?
- Leverage duplicate rules in Salesforce to validate duplicates during the account creation process.
- Implement an MDM solution to validate the customer information before creating Accounts in Salesforce.
- Develop an Apex class that searches for duplicates and removes them nightly.
- Build a custom search functionality that allows sales associates to search for customers in real time upon visiting their retail stores.
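
Two of the questions above hinge on how big objects are built (via the Metadata API) and how they can be reported on (standard SOQL, Async SOQL, or Einstein Analytics). The sketch below shows a synchronous SOQL query against a hypothetical big object; Asset_Reading__b, its fields, and its assumed index (Asset_Id__c followed by Reading_Time__c) are illustrative and not taken from the questions.

```apex
public class AssetReadingReport {
    // Synchronous SOQL on a big object must filter the index fields starting
    // from the first field in the index without skipping any; a range operator
    // is only allowed on the last filtered field.
    public static List<Asset_Reading__b> lastWeek(String assetId) {
        Datetime cutoff = Datetime.now().addDays(-7);
        return [
            SELECT Asset_Id__c, Reading_Time__c, Reading_Value__c
            FROM Asset_Reading__b
            WHERE Asset_Id__c = :assetId
              AND Reading_Time__c >= :cutoff
        ];
    }
}
```

For heavy weekly aggregation across millions of rows, the answer choices in the reporting question point toward Einstein Analytics or Async SOQL rather than synchronous queries like this one.
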

Northern Trail Outfitters (NTO) uses Sales Cloud and Service Cloud to manage its sales and support processes. Some of NTO's teams are complaining that they see new fields on their page and are unsure of which values need to be input. NTO is concerned about the lack of governance in making changes to Salesforce. Which governance measure should a data architect recommend to solve this issue?
- Add description fields to explain why the field is used, and mark the fields as required.
- Create validation rules with error messages to explain why the field is used.
- Create reports to identify which fields users are leaving blank, and use external data sources to augment the missing data.
- Create and manage a data dictionary, and use a governance process for changes made to common objects.

Universal Containers (UC) has a Salesforce org with multiple automated processes defined for group membership processing. UC also has multiple admins on staff who perform manual adjustments to the role hierarchy. The automated tasks and manual tasks overlap daily, and UC is experiencing "lock errors" consistently. What should a data architect recommend to mitigate these errors?
- Enable granular locking.
- Remove SOQL statements from Apex loops.
- Ask Salesforce Support for additional CPU power.
- Enable sharing recalculations.

A large retail company has recently chosen Salesforce as its CRM solution. It has the following record counts: 2,500,000 Accounts and 25,000,000 Contacts. When doing an initial performance test, the data architect noticed extremely slow response times for reports and list views. What should a data architect do to solve the performance issues?
- Limit data loading to the 2,000 most recently created records.
- Add custom indexes on frequently searched Account and Contact fields.
- Create a skinny table to represent the Account and Contact objects.
- Load only data that the user is permitted to access.

Universal Containers (UC) is planning a massive Salesforce implementation with large volumes of data. As part of the org's implementation, several roles, territories, groups, and sharing rules have been configured. The data architect has been tasked with loading all of the required data, including user data, in a timely manner. What should a data architect do to minimize data load times due to system calculations?
- Enable granular locking to avoid the "UNABLE_TO_LOCK_ROW" error.
- Leverage the Bulk API and concurrent processes with multiple batches.
- Load the data through Data Loader and turn on parallel processing.
- Enable defer sharing calculations and suspend sharing rule calculations.

A large retail B2C customer wants to build a 360-degree view of its customers for its call center agents. The customer information is currently maintained in the following systems: 1. Salesforce CRM. 2. Custom billing solution. 3. Customer master data management (MDM). 4. Contract management system. 5. Marketing solution. What should a data architect recommend that would help uniquely identify the customer across multiple systems?
- Create a customer database, and use this ID in all systems.
- Create a custom field as an external ID to maintain the customer ID from the MDM solution.
- Store the Salesforce ID in all the solutions to identify the customer.
- Create a custom object that will serve as a cross-reference for the customer ID.

A large multinational B2C Salesforce customer is looking to implement its distributor management application in Salesforce. The application has the following capabilities: 1. Distributors create Sales Orders in Salesforce. 2.
Sales Orders are based on product prices applicable to their region. 3. Sales Orders are closed once they are fulfilled. 4. It has been decided to maintain Sales Orders in the Opportunity object. How should the data architect model this requirement?
- Manually update Opportunities with prices applicable to distributors.
- Add custom fields in Opportunity and use triggers to update prices.
- Configure price books for each region and share them with distributors.
- Create a lookup to a custom Price object and share it with distributors.

To address different compliance requirements such as the General Data Protection Regulation (GDPR), personally identifiable information (PII), the Health Insurance Portability and Accountability Act (HIPAA), and others, a Salesforce customer decided to categorize each data element in Salesforce with the following: 1. Data owner. 2. Security level (e.g., confidential). 3. Compliance type (e.g., GDPR, PII, HIPAA). A compliance audit would require Salesforce admins to generate reports to manage compliance. What should a data architect recommend to address this requirement?
- Create a custom object and fields to capture the necessary compliance information and build custom reports.
- Build reports for field information, then export the information to classify and report for audits.
- Use the Metadata API to extract field attribute information and use the extract to classify and build reports.
- Use field metadata attributes for compliance categorization, data owner, and data sensitivity level.

Universal Containers has 30 million case records. The Case object has 80 fields. Agents are reporting performance issues and time-outs while running case reports in the Salesforce org. Which solution should a data architect recommend to improve reporting performance?
- Create a custom object to store aggregate data and run reports.
- Move data off the platform, run reporting outside Salesforce, and give access to the reports.
- Build reports using custom Lightning components.
- Contact Salesforce Support to enable a skinny table for Cases.

Universal Containers (UC) is going through a major reorganization of its sales team. This will require changes to a large number of group memberships and sharing rules. UC's administrator is concerned about long processing times and failures during the process. What should a data architect implement to make the changes efficiently?
- Log a case with Salesforce to make the sharing rule changes.
- Log out all users and make the changes to sharing rules.
- Enable Defer Sharing Calculation prior to making the sharing rule changes.
- Delete old sharing rules and build new sharing rules.

Universal Containers developers have created a new Lightning component that leverages an Apex controller using a SOQL query to populate a custom list view. Users are complaining that the component often fails to load and returns a timeout error. What tool should a data architect use to identify why the query is taking too long?
- Open a ticket with Salesforce Support to retrieve transaction logs to be analyzed for processing time.
- Use Salesforce's query optimizer to analyze the query in the Developer Console.
- Use Splunk to query the system logs looking for transaction time and CPU usage.
- Enable and use the Query Plan tool in the Developer Console.

A customer wants to maintain geographic location information, including latitude and longitude, in a custom object. What should a data architect recommend to satisfy this requirement?
- Recommend AppExchange packages to support this requirement.
- Create custom fields to maintain latitude and longitude information.
- Create formula fields with geolocation functions for this requirement.
- Create a geolocation custom field to meet this requirement.

Universal Containers (UC) is replacing a homegrown CRM solution with Salesforce. UC has decided to migrate operational (open and active) records to Salesforce, while keeping historical records in the legacy system. UC would like historical records to be available in Salesforce on an as-needed basis. Which solution should a data architect recommend to meet the business requirement?
- Leverage real-time integration to pull records into Salesforce.
- Bring all data into Salesforce, and delete it after a year.
- Leverage a mashup to display historical records in Salesforce.
- Build a swivel-chair solution to go into the legacy system and display records.

Universal Containers needs to load a large volume of leads into Salesforce on a weekly basis. During this process the validation rules are disabled. What should a data architect recommend to ensure data quality is maintained in Salesforce?
- Develop a custom Batch Apex process to improve quality once the load is completed.
- Ensure the lead data is preprocessed for quality before loading it into Salesforce.
- Allow validation rules to be activated during the load of leads into Salesforce.
- Activate validation rules once the leads are loaded into Salesforce to maintain quality.

As part of addressing General Data Protection Regulation (GDPR) requirements, Universal Containers (UC) plans to implement a data classification policy for all of its internal systems that store customer information, including Salesforce. What should a data architect recommend so that UC can easily classify customer information maintained in Salesforce under both standard and custom objects?
- Build reports that contain customer data and classify it manually.
- Create a custom picklist field to capture the classification of information on the Contact object.
- Use an AppExchange product to classify fields based on the policy.
- Use the Data Classification metadata fields available in the field definition.

The data architect for Universal Containers has written a SOQL query that will return all records from the Task object that have a value in the WhatId field: SELECT Id, Description, Subject FROM Task WHERE WhatId != NULL. When the data architect uses the query to select values for a process, a time-out error occurs. What does the data architect need to change to make this query more performant?
- Add LIMIT 100 to the query.
- Change the query to SOSL.
- Change the WHERE clause to filter by a deterministic defined value.
- Remove Description from the requested field set.

Universal Containers is implementing Salesforce and needs to migrate data from two legacy systems. UC would like to clean and deduplicate the data before migrating it to Salesforce. Which solution should a data architect recommend for a clean migration?
- Define external IDs for an object, migrate the second database to the first database, and load into Salesforce.
- Set up a staging database, and define external IDs to merge, clean, and deduplicate data before loading into Salesforce.
- Define external IDs for an object, insert data from one database, and use upsert for the second database.
- Define duplicate rules in Salesforce, and load data into Salesforce from both databases.

Universal Containers is using Salesforce for opportunity management and an enterprise resource planning (ERP) system for order management. Sales reps do not have access to the ERP and have no visibility into order status.
What solution should a data architect recommend to give the sales team visibility into order status?
- Leverage Salesforce Connect to bring the order line items from the legacy system into Salesforce.
- Build a real-time integration to pull order line items into Salesforce when viewing orders.
- Leverage Canvas to bring the order management UI into a Salesforce tab.
- Build batch jobs to push order line items to Salesforce.

Universal Containers (UC) is migrating from a legacy system to Salesforce CRM. UC is concerned about the quality of data being entered by users and through external integrations. Which two solutions should a data architect recommend to mitigate data quality issues? Choose 2 answers.
- Leverage picklist and lookup fields where possible.
- Leverage third-party AppExchange tools.
- Leverage Apex to validate the format of data being entered via a mobile device.
- Leverage validation rules and workflows.

Northern Trail Outfitters (NTO) has a variety of customers that include households, businesses, and individuals. The following conditions exist within its system: 1. NTO has a total of five million customers. 2. Duplicate records exist and are replicated across many systems, including Salesforce. Given these conditions, there is a lack of consistent presentation and clear identification of a customer record. Which three options should a data architect perform to resolve the issues with the customer data? Choose 3 answers.
- Use Salesforce CDC to sync customer data across all systems to keep customer records in sync.
- Invest in a deduplication tool to de-dupe and merge duplicate records across all systems.
- Create a unique global customer ID for each customer and store it in all systems for referential identity.
- Duplicate customer records across the systems and provide a two-way sync of data between the systems.
- Create a customer master database external to Salesforce as a system of truth and sync the customer data with all systems.

Universal Containers (UC) has the following systems: 1. Billing system. 2. Customer support system. 3. CRM system. UC has been having trouble with business intelligence across the different systems. Recently, UC implemented a master data management (MDM) solution that will be the system of truth for the customer records. Which MDM data element is needed to allow reporting across these systems?
- Globally Unique Identifier.
- Phone number.
- Full name.
- Email address.

Universal Containers (UC) has a very large and complex Salesforce org with hundreds of validation rules and triggers. The triggers are responsible for system updates and data manipulation as records are created or updated by users. A majority of the automation tools within UC's org were not designed to run during a data load. UC is importing 100,000 records into Salesforce across several objects over the weekend. What should a data architect do to mitigate any unwanted results during the import?
- Bulkify the triggers to handle import loads.
- Ensure validation rules, triggers, and other automation tools are disabled.
- Ensure duplicate and matching rules are defined.
- Import the data in smaller batches over a 24-hour period.

Universal Containers (UC) is in the process of selling half of its company. As part of this split, UC's main Salesforce org will be divided into two orgs: Org A and Org B. UC has delivered these requirements to its data architect: 1. The data model for Org B will drastically change, with different objects, fields, and picklist values. 2.
Three million records will need to be migrated from Org A to Org B for compliance reasons. 3. The migration will need to occur within the next two months, prior to the split. Which migration strategy should a data architect use to successfully migrate the data?
- Use the Salesforce CLI to query, export, and import.
- Use an ETL tool to orchestrate the migration.
- Use Data Loader for export and the Data Import Wizard for import.
- Write a script to use the Bulk API.

Northern Trail Outfitters (NTO) runs its entire business out of an enterprise data warehouse (EDW). NTO's sales team is starting to use Salesforce after a recent implementation, but currently lacks the data required to advance leads and opportunities to the next stage. NTO's management has researched Salesforce Connect and would like to use it to virtualize and report on data from the EDW within Salesforce. NTO will be running thousands of reports per day across 10 to 15 external objects. What should a data architect consider before implementing Salesforce Connect for reporting?
- Maximum external objects per org.
- OData callout limits per day.
- Maximum page size for server-driven paging.
- Maximum number of records returned.

Universal Containers has implemented Sales Cloud to manage patient and related health records. During a recent security audit of the system, it was discovered that some standard and custom fields need to be encrypted. Which solution should a data architect recommend to encrypt the existing fields?
- Use the Apex Crypto class to encrypt custom and standard fields.
- Export data out of Salesforce and encrypt custom and standard fields.
- Implement Shield Platform Encryption to encrypt custom and standard fields.
- Implement Classic Encryption to encrypt custom and standard fields.

Northern Trail Outfitters has these simple requirements for a data export process: 1. The file format should be CSV. 2. The process should be scheduled and run once per week. 3. The export should be configurable through the Salesforce UI. Which tool should a data architect leverage to accomplish these requirements?
- Data Export Wizard.
- Bulk API.
- Data Loader.
- Third-party ETL tool.

A large insurance provider is looking to implement Salesforce. The following conditions exist: 1. Multiple channels for lead acquisition. 2. Duplicate leads across channels. 3. Poor customer experience and higher costs. On analysis, it was found that there are duplicate leads that are resulting in multiple quotes and opportunities. Which three actions should a data architect recommend to mitigate the issues? Choose 3 answers.
- Implement a third-party solution to clean and enrich lead data.
- Implement a de-duplication strategy to prevent duplicate leads.
- Build a process to manually search for and merge duplicates.
- Standardize lead information across all channels.
- Build a custom solution to identify and merge duplicate leads.

Universal Containers (UC) has accumulated data over the years and has never deleted data from its Salesforce org. UC is now exceeding the storage allocations in the org and is looking for options to delete unused records. Which three recommendations should a data architect make in order to reduce the number of records in the org? Choose 3 answers.
- Use hard delete in the Bulk API to permanently delete records from Salesforce.
- Identify records in objects that have not been modified or used in the last three years.
- Archive the records in an enterprise data warehouse (EDW) before deleting them from Salesforce.
- Use the REST API to permanently delete records from the Salesforce org.
- Use hard delete in Batch Apex to permanently delete records from Salesforce.

Universal Containers (UC) stores 10 million rows of inventory data in a cloud database. As part of creating a connected experience in Salesforce, UC would like to expose this inventory data to Sales Cloud without a direct import. UC has asked its data architect to determine if Salesforce Connect is needed. Which three considerations should the data architect make when evaluating the need for Salesforce Connect? Choose 3 answers.
- You need to expose data via a virtual private connection.
- You need small amounts of external data at any one time.
- You have a large amount of data and would like to copy subsets of it into Salesforce.
- You want real-time access to the latest data from other systems.
- You have a large amount of data that you don't want to copy into your Salesforce org.

Universal Containers has a legacy client-server app with a relational database that needs to be migrated to Salesforce. What are the three key actions that should be done when data modeling in Salesforce? Choose 3 answers.
- Map legacy data to Salesforce objects.
- Identify data elements to be persisted in Salesforce.
- Map legacy data to custom metadata types.
- Implement the legacy data model within Salesforce using custom fields.
- Work with the legacy application owner to analyze the legacy data model.

Universal Containers (UC) would like to build a Human Resources application on Salesforce to manage employee details, payroll, and hiring efforts. To adequately maintain and store the relevant data, the application will need to leverage 45 custom objects. In addition to this, UC expects roughly 20,000 API calls into Salesforce from an on-premise application daily. Which license type should a data architect recommend that best fits these requirements?
- Lightning External Apps Starter.
- Lightning Platform Starter.
- Lightning Platform Plus.
- Service Cloud.

Universal Containers (UC) is in the process of implementing an enterprise data warehouse (EDW). UC needs to extract 100 million records from Salesforce for migration to the EDW. What data extraction strategy should a data architect use for maximum performance?
- Call the REST API in successive queries.
- Utilize PK Chunking with the Bulk API.
- Install a third-party AppExchange tool.
- Use the Bulk API in parallel mode.

Northern Trail Outfitters (NTO) wants to implement backup and restore for Salesforce data. Currently, it has a data backup process that runs weekly and backs up all Salesforce data to an enterprise data warehouse (EDW). NTO wants to move to daily backups and provide restore capability to avoid any data loss in case of an outage. What should a data architect recommend for a daily backup and restore solution?
- Change the weekly backup process to a daily backup, and implement a custom restore solution.
- Use an AppExchange package for backup and restore.
- Use an ETL tool for backup and restore from the EDW.
- Use the Bulk API to extract data on a daily basis to the EDW, and the REST API for restore.

Universal Containers (UC) has one Salesforce org (Org A) and recently acquired a secondary company with its own Salesforce org (Org B). UC has decided to keep the orgs running separately, but would like to bidirectionally share Opportunities between the orgs in near-real time. Which three options should a data architect recommend to share data between Org A and Org B? Choose 3 answers.
- Install a third-party AppExchange tool to handle the data sharing.
- Develop an Apex class that pushes Opportunity data between orgs daily via the Apex scheduler.
- Leverage Heroku Connect and Heroku Postgres to bidirectionally sync Opportunities.
- Use Salesforce Connect and the cross-org adapter to virtualize Opportunities into external objects.
- Leverage middleware tools to bidirectionally send Opportunity data across orgs.

Universal Containers is migrating 100,000 accounts from an enterprise resource planning (ERP) system to Salesforce and is concerned about ownership skew and performance. Which three recommendations should a data architect provide to prevent ownership skew? Choose 3 answers.
- Keep users out of public groups that can be used as the source for sharing rules.
- Assign View All permissions to a group of users for the Account object.
- Assign a default user as owner of the accounts and assign them a role in the hierarchy.
- Assign a default user as owner of the accounts and do not assign any role to the default user.
- Assign a default user as owner of the accounts and assign the user to the topmost role in the hierarchy.

Universal Containers requires all customers to provide either a phone number or an email address when registering for an account. What should the data architect use to ensure this requirement is met?
- Apex Class.
- Validation Rule.
- Required Fields.
- Process Builder.

Universal Containers (UC) has a requirement to migrate 100 million order records from a legacy enterprise resource planning (ERP) application into the Salesforce platform. UC does not have any requirements around reporting on the migrated data. What should a data architect recommend to reduce performance degradation of the platform?
- Use the standard Order object to store the data.
- Use a standard big object defined by Salesforce.
- Implement a custom big object to store the data.
- Create a custom object to store the data.

Universal Containers (UC) has released a new disaster recovery (DR) policy that states that cloud solutions need a business continuity plan in place separate from the cloud provider's built-in data recovery solution. Which solution should a data architect use to comply with the DR policy?
- Utilize an ETL tool to migrate data to an on-premise archival solution.
- Leverage a third-party tool that extracts Salesforce data and metadata, and stores the information in an external protected system.
- Write a custom batch job to extract data changes nightly, and store them in an external protected system.
- Leverage Salesforce weekly exports, and store the data in flat files on a protected system.

Universal Containers is implementing Sales Cloud for patient management and would like to encrypt sensitive patient records being stored in files. Which solution should a data architect recommend to solve this requirement?
- Implement Shield Platform Encryption to encrypt files.
- Implement a third-party AppExchange app to encrypt files.
- Use Classic Encryption to encrypt files.
- Store files outside of Salesforce and access them in real time.

Universal Containers is migrating individual customer (B2C) data from legacy systems to Salesforce. There are millions of customers stored as accounts and contacts in the legacy database. Which object model should a data architect configure within Salesforce?
- Leverage the standard Account and Contact objects in Salesforce.
- Leverage the Person Account object in Salesforce.
- Leverage custom account and contact objects in Salesforce.
- Leverage a custom person account object in Salesforce.

Universal Containers (UC) is implementing Salesforce and will be using Salesforce to track customer complaints, provide white papers on products, and provide subscription-based support.
Which license type will UC users need to fulfill UC's requirements?
- Salesforce License.
- Service Cloud License.
- Sales Cloud License.
- Lightning Platform Starter License.

Northern Trail Outfitters (NTO) wants to start a loyalty program to reward repeat customers. The program will track every item a customer has bought and grant them points for discounts. The following conditions will exist upon implementation: 1. Data will be used to drive marketing and product development initiatives. 2. NTO estimates that the program will generate 100 million rows of data monthly. 3. NTO will use Salesforce's Einstein Analytics and Discovery to leverage the data and make business and marketing decisions. What should the data architect do to store, collect, and use the reward program data?
- Have Einstein connect to the point-of-sale system to capture the reward program data.
- Create a big object in Einstein Analytics to capture the loyalty program data.
- Create a custom object in Salesforce that will be used to capture the reward program data.
- Create a custom big object in Salesforce that will be used to capture the reward program data for consumption by Einstein.

Northern Trail Outfitters needs to implement an archive solution for Salesforce data. This archive solution needs to help NTO do the following: 1. Remove outdated information not required on a day-to-day basis. 2. Improve Salesforce performance. Which solution should be used to meet these requirements?
- Create a full copy sandbox, and use it as a source for retaining archived data.
- Identify a location to store archived data, and move the data to that location using a time-based workflow.
- Identify a location to store archived data, and use scheduled batch jobs to migrate and purge the aged data on a nightly basis.
- Use a formula field that shows true when a record reaches a defined age, and use that field to run a report and export the report to SharePoint.

Northern Trail Outfitters would like to report on the type of customers. A custom field for customer type was created on the Account object. Users need to be limited to the following defined choices when entering information in this field: 1. High Value. 2. Medium Value. 3. Low Value. Which strategy should a data architect recommend to configure customer type?
- Single-select restricted picklist with the defined choices.
- Create a validation rule to limit entry to the defined choices.
- Lookup to a custom object with the defined choices.
- Provide help text to guide users with the defined choices.

Universal Containers uses Apex jobs to create leads in Salesforce. The business team has requested that lead creation failures be reported to them. Which option does Apex provide to report errors from this Apex job?
- Use Apex services to email failures to the business when an error occurs.
- Save Apex errors in a custom object, and allow access to this object for reporting.
- Use an AppExchange package to clean lead information before the Apex job processes it.
- Use a custom object to store leads, and allow unprocessed leads to be reported.

Universal Containers (UC) has decided to retire a homegrown CRM application and move to Salesforce. As part of the implementation, UC will need to migrate 5 billion records in a series of batches. The records are a collection of Accounts, Contacts, Opportunities, Products, and Orders. Since the legacy CRM application and Salesforce have different data models, the data will need to be transformed and normalized before it is loaded into the system.
What should a data architect recommend as a migration and storage strategy?
- Migrate the data to a centralized datastore first with ETL tools.
- Import all data into custom objects with Data Loader.
- Use an ETL tool to extract and transform the data, then load it into standard objects.
- Import all data into custom objects with middleware.

A national nonprofit organization is using Salesforce to recruit members. The recruitment process requires a member to be matched with a volunteer opportunity. Given the following: 1. A record is created in Project__c and used to track the project through completion. 2. The member may then start volunteering and is required to track their volunteer hours, which are stored in the VTOTime__c object related to the project. 3. The ability to view or edit the VTOTime__c object needs to be the same as for the Project__c record. 4. Managers must see total hours volunteered while viewing the Project__c record. Which data relationship should the data architect use to support this requirement when creating the custom VTOTime__c object?
- Master-detail field on Project__c to VTOTime__c, showing a list of VTOTime__c records in a related list.
- Master-detail field on VTOTime__c to Project__c, with a rollup summary field on Project__c showing the sum of hours from VTOTime__c records.
- Lookup field on Project__c to VTOTime__c, displaying a list of VTOTime__c records in a related list.
- Lookup field on VTOTime__c to Project__c, with a formula field on Project__c showing the sum of hours from VTOTime__c records.

Universal Containers has provided a web order form for its customers and has noticed invalid data coming in on orders. What should be used to mitigate this problem?
- Workflow Rules.
- Validation Rules.
- Apex Trigger.
- Formatted Fields.

Universal Containers (UC) has implemented a master data management strategy, which uses a central system of truth, to ensure the entire company has the same customer information in all systems. UC customer data changes need to be accurate at all times in all of the systems. Salesforce is the identified system of record for this information. What is the correct solution for ensuring all systems using customer data are kept up to date?
- Send customer data nightly to the system of truth in a scheduled batch job.
- Send customer record changes from Salesforce to the system of truth in real time.
- Have each system pull the record changes from Salesforce using Change Data Capture.
- Send customer record changes from Salesforce to each system in a nightly batch job.

Universal Containers is establishing a call center that will use Salesforce. UC receives 10 million calls and creates 100 million cases every month. Cases are linked to a custom call object using a lookup relationship. UC would like to run reports and dashboards to better understand the different case types being created on calls in order to better serve customers. What solution should a data architect recommend to meet the business requirement?
- Leverage custom objects to store aggregate data and run analytics.
- Leverage out-of-the-box reports and dashboards on the Case object and the interactive voice response (IVR) custom object.
- Archive records to a data warehouse and run analytics on the data warehouse.
- Leverage big objects to archive records and Einstein Analytics to run reports.

Universal Containers (UC) is migrating from an on-premise, homegrown customer relationship management (CRM) system. During analysis, UC users highlight a pain point that there are multiple versions of many customers.
What should the data architect do for a successful migration to mitigate the pain point?
- Hire an intern to manually de-duplicate the records after migrating to Salesforce.
- Store the data in a staging database, and de-duplicate identical records.
- Have the users manually clean the data in the old system prior to migration.
- Migrate the data as is, and use Salesforce's de-duplication features.

Which API should a data architect use when exporting 1 million records from Salesforce?
- REST API.
- SOAP API.
- Streaming API.
- Bulk API.

Northern Trail Outfitters (NTO) has multiple Salesforce orgs based on regions. Users need read-only access to customers across all Salesforce orgs. Which feature in Salesforce can be used to provide access to customer records across all NTO orgs?
- External APIs.
- Federated Search.
- Salesforce Connect.
- Salesforce to Salesforce.

A consumer products company has decided to use Salesforce for its contact center. The contact center agents need access to the following information in the Service Console when a customer contacts them: 1. Customer browsing activity on its website, stored in its on-premise system. 2. Customer interactions with sales associates at its retail stores, maintained in Salesforce. 3. Contact center interactions, maintained in Salesforce. 4. Email campaign activity to customers from its marketing systems. What should a data architect do to fulfill these requirements with minimum development effort in Salesforce?
- Create web tabs in the Service Console to show website and marketing activities.
- Use Salesforce Connect to integrate the website and the marketing system into the Service Console using external objects.
- Build custom components in the Service Console to bring in the marketing and website information.
- Build a customer view in the Service Console with components that show website data and marketing data as a mashup.

Northern Trail Outfitters (NTO) has an external product master system that syncs product and pricing information with Salesforce. Users have been complaining that they are seeing discrepancies in the product and pricing information displayed on the NTO website and in Salesforce. As a data architect, which action is recommended to avoid data sync issues?
- Use Customer 360 Data Manager to sync product and pricing information from the product master database to Salesforce.
- Implement a manual process to update the products from an extract from the product master on a weekly basis.
- Build a custom integration for a one-way sync of product and pricing information from the product master to Salesforce.
- Build a custom integration for a two-way sync of product and pricing information between the product master and Salesforce.

Northern Trail Outfitters is planning to build a consent form to record customer authorization for marketing purposes. What should a data architect recommend to fulfill this requirement?
- Create a custom object to maintain the authorization.
- Utilize the Authorization Form Consent object to capture the consent.
- Use custom fields to capture the authorization details.
- Use an AppExchange solution to address the requirement.

Universal Containers (UC) needs to aggregate monthly and yearly metrics using standard reports. UC's monthly and yearly details are stored in custom objects with four million monthly records and nine million yearly records. The reports are aggregating millions of records across the two objects and taking a long time to return results. What solution should a data architect recommend to improve report performance?
- Delete old data from Salesforce and run reports on it.
- Create an aggregation custom object that summarizes the monthly and yearly values.
- Move data out of Salesforce and run reporting on it.
- Leverage a big object to store the data and run reports on it.

Northern Trail Outfitters (NTO) wants to capture a list of customers that have bought a particular product. The solution architect has recommended creating a custom object for products, and creating a lookup relationship between its customers and its products. Products will be modeled as a custom object (NTO_Product__c) and customers are modeled as person accounts. Every NTO product may have millions of customers looking up a single product, resulting in lookup skew. What should a data architect suggest to mitigate issues related to lookup skew?
- Create multiple similar products and distribute the skew across those products.
- Change the lookup relationship to a master-detail relationship.
- Select the "Clear the value of this field" option while configuring the lookup relationship.
- Create a custom object to maintain the relationship between products and customers.

Universal Containers has multiple systems, all containing and maintaining customer data. Although point-to-point integrations are in place, customers are complaining about inconsistencies in the data. What solution should the data architect recommend?
- An MDM solution as the customer master, with centralized integrations to ensure consistency across all systems.
- Perform a one-time synchronization to level-set the built-up inconsistencies.
- Data cleanse each system.
- Improve the existing point-to-point integrations.

Northern Trail Outfitters (NTO) is streaming IoT data from connected devices to a cloud database. Every 24 hours, 100,000 records are generated. NTO employees will need to see these IoT records within Salesforce and generate weekly reports on them. Developers may also need to write programmatic logic to aggregate the records and incorporate them into workflows. Which data pattern will allow a data architect to satisfy these requirements, while also keeping limits in mind?
- Virtualization.
- Bidirectional integration.
- Persistence.
- Unidirectional integration.

Universal Containers (UC) is building a Service Cloud call center application and has a multi-system support solution. UC would like to ensure that all systems have access to the same customer information. What solution should a data architect recommend?
- Load customer data into all systems.
- Let each system be the owner of the data it generates.
- Make Salesforce the system of record for all data.
- Implement a master data management (MDM) strategy for customer data.

Northern Trail Outfitters (NTO) plans to maintain contact preferences for customers and employees. NTO has implemented the following: 1. Customers are Person Accounts for its retail business. 2. Customers are represented as Contacts for its commercial business. 3. Employees are maintained as Users. 4. Prospects are maintained as Leads. NTO needs to implement a standard communication preference management model for Person Accounts, Contacts, Users, and Leads. Which option should the data architect recommend to NTO to satisfy this requirement?
- Create custom fields for contact preferences in the Lead, Person Account, and User objects.
- Create a custom object to maintain preferences and build relationships to Lead, Person Account, and User.
- Create a case for contact preferences, and use it to validate the preferences for Leads, Person Accounts, and Users.
- Use the Individual object to maintain the preferences, with relationships to Lead, Person Account, and User.

Universal Containers' system administrators have been complaining that they are not able to make changes to user records, including moving them to new territories, without getting "unable to lock row" errors. This is causing the system admins to spend hours updating user records every day. What should the data architect do to prevent the errors?
- Reduce the number of users updated concurrently.
- Increase CPU for the Salesforce org.
- Analyze Splunk queries to spot offending records.
- Enable granular locking.

Northern Trail Outfitters (NTO) has recently implemented Salesforce to track opportunities across all its regions. NTO sales teams across all regions have historically managed their sales process in Microsoft Excel. NTO sales teams are complaining that their data from the Excel files was not migrated as part of the implementation, and NTO is now facing low Salesforce adoption. What should a data architect recommend to increase Salesforce adoption?
- Define a standard mapping and train sales users to import opportunity data.
- Create a Chatter group and upload all Excel files to the group.
- Use the Excel connector to Salesforce to sync data from individual Excel files.
- Load the data into an external database and provide sales users access to the database.

Universal Containers has implemented Salesforce for its operations. In order for customers to be created in its MDM solution, the customer record needs to have the following attributes: 1. First Name. 2. Last Name. 3. Email. Which option should the data architect recommend to mandate this when customers are created in Salesforce?
- Configure the page layout, marking the attributes as required fields.
- Create validation rules to check if the required attributes are entered.
- Mark the fields for the attributes as required under Setup.
- Build validation into the integration with the MDM to check the required attributes.

Universal Containers (UC) is migrating data from a legacy system to Salesforce. During data analysis, it was discovered that the data types of the fields being migrated do not match Salesforce data types. Which solution should a data architect use to ensure a successful data migration?
- Export legacy data into CSV files and leverage Data Loader to load the data into Salesforce.
- Export legacy data into a staging database and leverage stored procedures to transform data types before loading into Salesforce.
- Migrate the legacy data leveraging an ETL tool to transform data types and load the data into Salesforce.
- Migrate legacy data to a staging database for mapping, then leverage an ETL tool to transform the data and load it into Salesforce.

Universal Containers (UC) has millions of case records with case history and service level agreement data. UC's compliance team would like historical cases to be accessible for 10 years for audit purposes. What solution should a data architect recommend?
- Use a custom big object to store archived case data.
- Purchase more data storage to support the Case object.
- Archive case data using a Salesforce archival solution.
- Use a custom object to store archived case data.

Universal Containers (UC) is implementing Salesforce for lead management. UC procures lead data from multiple sources and would like to make sure the lead data has company profile and location information. Which solution should a data architect recommend to make sure lead data has both profile and location information?
- Leverage external data providers to populate company profile and location data.
- Run reports to identify records that do not have company profile and location data.
- Export data out of Salesforce and send it to another team to populate company profile and location data.
- Ask salespeople to search for and populate company profile and location data.

Northern Trail Outfitters has decided that it is going to build a channel sales portal with the following requirements: 1. External resellers are able to authenticate to the portal with a login. 2. Lead data, Opportunity data, and Order data are available to authenticated users. 3. Authenticated users may need to run reports and dashboards. 4. There is no need for more than 10 custom objects or additional file storage. Which Community Cloud license type should a data architect recommend to meet the portal requirements?
- Customer Community Plus.
- Customer Community.
- Partner Community.
- Lightning External Apps Starter.

Northern Trail Outfitters (NTO) has implemented Salesforce for its sales users. Opportunity management in Salesforce is implemented as follows: 1. Sales users enter their opportunities in Salesforce for forecasting and reporting purposes. 2. NTO has a product pricing system (PPS) that is used to update the Opportunity Amount field on opportunities on a daily basis. 3. PPS is the trusted source within NTO for Opportunity Amount. 4. NTO uses Opportunity Forecasts for its sales planning and management. Sales users have noticed that their updates to the Opportunity Amount field are overwritten when PPS updates their opportunities. How should a data architect address this overwriting issue?
- Create a custom field for Opportunity Amount that PPS updates, separating it from the field sales users update.
- Change the PPS integration to update the Opportunity Amount field only when the value is null.
- Create a custom field for Opportunity Amount that sales users update, separating it from the field that PPS updates.
- Change Opportunity Amount field access to read-only for sales users using field-level security.

Universal Containers (UC) owns multiple Salesforce orgs that are distributed across regional branches. Each branch stores local customer data inside its org's Account and Contact objects. This creates a scenario where UC is unable to view customers across all orgs. UC has an initiative to create a 360-degree view of the customer, as UC would like to see Account and Contact data from all of its orgs in one place. Which solution should a data architect suggest to achieve a 360-degree view of the customer?
- Consolidate the data from each org into a centralized datastore.
- Use the Salesforce Connect cross-org adapter.
- Build a bidirectional integration between all orgs.
- Use an ETL tool to migrate gap Accounts and Contacts into each org.

Universal Containers has a Sales Cloud implementation for a sales team and an enterprise resource planning (ERP) system as the customer master. Sales teams are complaining about duplicate account records and data quality issues with account data. Which solution should a data architect recommend to resolve the complaints?
- Build a nightly sync job from the ERP to Salesforce.
- Do a bulk delete of all account records in Salesforce and a complete reload from the ERP.
- Implement a de-dupe solution and establish account ownership in Salesforce.
- Build a nightly batch job to de-dupe data and merge account records.

A large healthcare provider wishes to use Salesforce to track patient care. The following actors are in Salesforce:
1. Payment providers: organizations that pay for the care given to patients. 2. Doctors: they provide care plans for patients and need to support multiple patients; they are provided access to patient information. 3. Patients: individuals who need care. A data architect needs to map the actors to Salesforce objects. What would be the optimal selection by the data architect?
- Patients as Contacts, payment providers as Accounts, and doctors as Accounts.
- Patients as Person Accounts, payment providers as Accounts, and doctors as Person Accounts.
- Patients as Accounts, payment providers as Accounts, and doctors as Person Accounts.
- Patients as Person Accounts, payment providers as Accounts, and doctors as Contacts.

Universal Containers is building a Salesforce application to track Contacts and the Conferences they have attended, with the following requirements: 1. Contacts will be stored in the standard Contact object. 2. Conferences will be stored in a custom Conference__c object. 3. Each Contact may attend multiple Conferences, and each Conference may be related to multiple Contacts. How should a data architect model the relationship between the Contact and Conference__c objects?
- Create a lookup relationship field on the Contact object.
- Create a master-detail relationship field on the Conference__c object.
- Implement a Contact Conference junction object with master-detail relationships to both Contact and Conference__c.
- Create a master-detail relationship field on the Contact object.

Northern Trail Outfitters (NTO) needs to extract 50 million records from a custom object every day from its Salesforce org. NTO is facing query time-out issues while extracting these records. What should a data architect recommend in order to get around the time-out issue?
- Use the REST API to extract data, as it automatically chunks records by 200.
- Use PK Chunking with the Bulk API.
- Use an extract, transform, load (ETL) tool for the extraction of the records.
- Ask Salesforce Support to increase the query timeout value.

What should a data architect do to provide additional guidance for users when they enter information in a standard field?
- Add custom help text in the default value for the field.
- Add a label field with help text adjacent to the custom field.
- Provide custom help text under the field properties.
- Create a custom page with help text for user guidance.

Universal Containers is planning its archiving and purging approach going forward for its custom objects Topic__c and Comment__c. Several options are being considered, including analytics snapshots, offsite storage, scheduled purges, etc. Which three questions should be considered when designing an appropriate archiving strategy?
- Will the data being archived need to be reported on or accessed in any way in the future?
- How many fields are defined on the custom objects that need to be archived?
- If reporting is necessary, can the information be aggregated into fewer, summary records?
- Are there any regulatory restrictions that will influence the archiving and purging plans?
- Which profiles and users currently have access to these custom object records?

Universal Containers (UC) is facing data quality issues where sales reps are creating duplicate customer accounts, contacts, and leads. UC wants to fix this issue immediately by prompting users about a record that possibly already exists in Salesforce. UC also wants a report regarding duplicate records. What would be the recommended approach to help UC start immediately?
- Create a before insert and before update trigger on Account, Contact, and Lead, and raise an error if a duplicate is found using custom matching criteria.
- Create an after insert and after update trigger on Account, Contact, and Lead, and raise an error if a duplicate is found using custom matching criteria.
- Create a duplicate rule for Account, Lead, and Contact, use the standard matching rules for these objects, and set the action to block for both creates and edits.
- Create a duplicate rule for Account, Lead, and Contact, use the standard matching rules for these objects, and set the action to report and alert for both creates and edits.

Universal Containers (UC) wants to ensure its data on 100,000 Accounts, pertaining mostly to US-based companies, is enriched and cleansed on an ongoing basis. UC is looking for a solution that allows easy monitoring of key data quality metrics. What should be the recommended solution to meet this requirement?
- Use a declarative approach by installing and configuring Data.com Clean to monitor Account data quality.
- Implement Batch Apex that calls out to a third-party data quality API in order to monitor Account data quality.
- Use a declarative approach by installing and configuring Data.com Prospector to monitor Account data quality.
- Implement an Apex trigger on Account that queries a third-party data quality API to monitor Account data quality.

Universal Containers (UC) has implemented Sales Cloud, and it has been noticed that sales reps are not entering enough data to run insightful reports and dashboards. UC executives would like to monitor and measure data quality metrics. What solution addresses this requirement?
- Use custom objects and fields to calculate data quality.
- Generate reports to view the quality of sample data.
- Export the data to an enterprise data warehouse and use BI tools for data quality.
- Use third-party AppExchange tools to monitor and measure data quality.

DreamHouse Realty has an integration that creates records in a Salesforce custom object. The custom object has a field marked as required on the page layout. DreamHouse Realty has noticed that many of the records coming from the external system are missing data in this field. The architect needs to ensure this field always contains data coming from the source system. Which two approaches should the architect take? Choose 2 answers.
- Set up a validation rule to prevent blank values.
- Blame the customer's external system for bad data.
- Create a workflow to default a value into this field.
- Mark the field as required in Setup at the field level.

NTO has multiple systems across its enterprise landscape, including Salesforce, with disparate versions of the customer records. In Salesforce, the customer is represented by the Contact object. NTO utilizes an MDM solution with these attributes: 1. The MDM solution keeps track of the customer master with a master key. 2. The master key is a map to the record IDs from each external system in which customer data is stored. 3. The MDM solution provides de-duplication features, so it acts as the single source of truth. How should a data architect implement the storage of the master key within Salesforce?
- Store the master key in Heroku Postgres and use Heroku Connect for synchronization.
- Create a custom object to store the master key with a lookup field to Contact.
- Create an external object to store the master key with a lookup field to Contact.
- Store the master key on the Contact object as an External ID field for referential integrity.
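For the master-key question above: when the master key is stored on Contact as an External ID field, external systems can reference customers by that key and never need to track Salesforce record IDs. The snippet below is a minimal sketch in Python using the simple_salesforce library; the Master_Key__c field name and the credentials are hypothetical placeholders, not part of the exam content.

```python
# Minimal sketch: upsert a Contact by an MDM master key stored as an External ID field.
# Assumptions: simple_salesforce is installed, Master_Key__c is a hypothetical custom
# External ID field on Contact, and the credentials below are placeholders.
from simple_salesforce import Salesforce

sf = Salesforce(
    username="integration.user@example.com",
    password="********",
    security_token="********",
)

# Upsert by external ID: creates the Contact if the key is new, updates it otherwise,
# which is what gives the MDM integration its referential integrity.
result = sf.Contact.upsert(
    "Master_Key__c/MK-000123",
    {"FirstName": "Pat", "LastName": "Smith", "Email": "pat.smith@example.com"},
)
print(result)  # HTTP status code, e.g., 201 (created) or 204 (updated)
```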
All accounts and opportunities are created in Salesforce. Salesforce is integrated with three systems: an ERP system feeds order data into Salesforce and updates both Account and Opportunity records; an accounting system feeds invoice data into Salesforce and updates both Account and Opportunity records; and a commission system feeds commission data into Salesforce and updates both Account and Opportunity records. How should the architect determine which of these systems is the system of record?
- Account and opportunity data originates in Salesforce, and therefore Salesforce is the system of record.
- Whatever system updates the attribute or object should be the system of record for that field or object.
- Whatever integration data flow runs last will, by default, determine which system is the system of record.
- Data flows should be reviewed with the business users to determine the system of record per object or field.

Universal Containers (UC) has around 200,000 customers (stored in the Account object). It gets 1 or 2 orders every month from each customer. Orders are stored in a custom object called Order__c, which has about 50 fields. UC is expecting growth of 10% year over year. Which two considerations should an architect take into account to improve the performance of SOQL queries that retrieve data from the Order__c object? Choose 2 answers.
- Make the queries more selective using indexed fields.
- Work with Salesforce Support to enable skinny tables.
- Use SOQL queries without WHERE conditions.
- Reduce the number of triggers on the Order__c object.

Universal Containers (UC) has 50 million customers and stores customer order history in an ERP system. UC also uses Salesforce to manage opportunities and customer support. In order to provide seamless customer support, UC would like to see a customer's order history when viewing the customer record during a sales or support call. What should a data architect do in order to provide this functionality while preserving the user experience?
- Use an Apex callout to populate a text area field for displaying the order history.
- Import the order history into a custom Salesforce object, updated nightly.
- Use Salesforce Connect and an external object to display the order history in Salesforce.
- Embed the ERP system in an iframe and display it on a custom tab.

Universal Containers (UC) has migrated its back-office data into an on-premise database with REST API access. UC recently implemented Sales Cloud for its sales organization, but users are complaining about a lack of order data inside Salesforce. UC is concerned about Salesforce storage limits but would still like Sales Cloud to have access to the data. Which design pattern should a data architect select to satisfy the requirement?
- Develop a bidirectional integration between the on-premise system and Salesforce.
- Migrate and persist the data in Salesforce to take advantage of native functionality.
- Build a UI for the on-premise system and iframe it in Salesforce.
- Use Salesforce Connect to virtualize the data in Salesforce and avoid storage limits.

A customer needs a sales model that allows the following: 1. Opportunities need to be assigned to salespeople based on zip code. 2. Each salesperson can be assigned to multiple zip codes. 3. Each zip code is assigned to a sales area definition, and sales are aggregated by sales area for reporting. What should a data architect recommend?
- Configure the Territory Management feature to support opportunity assignment.
- Assign opportunities using list views by zip code.
- Allow sales users to manually assign opportunity ownership based on zip code.
- Add custom fields on opportunities for zip code and use assignment rules.

A large telecommunications provider that provides internet services to both residences and businesses has the following attributes: 1. A customer who purchases its services for their home will be created as an Account in Salesforce. 2. Individuals at the same house address will be created as Contacts in Salesforce. 3. Businesses are created as Accounts in Salesforce. Some customers have services at both their home and their business. What should a data architect recommend for a single view of these customers without creating multiple customer records?
- Customers are created as Individual objects and related to the Business and Residence Accounts.
- Customers are created as Person Accounts and related to Business and Residential Accounts using the Account Contact relationship.
- Customers are created as Contacts and related to Business and Residential Accounts using Account Contact Relationships.
- Customers are created as Accounts for the Residence Account, with the Parent Account used to relate the Business Account.

Universal Containers has been a customer of Salesforce for 10 years. It currently has 2 million accounts in the system. Due to an erroneous integration built 3 years ago, it is estimated there are 500,000 duplicates in the system. Which solution should a data architect recommend to remediate the duplication issue?
- Utilize a data warehouse as the system of truth.
- Extract the data using Data Loader and use Excel to merge the duplicate records.
- Implement duplicate rules.
- Develop an ETL process that utilizes the merge API to merge the duplicate records.

Universal Containers (UC) recently migrated 1 billion customer-related records from a legacy datastore to Heroku Postgres. A subset of the data needs to be synchronized with Salesforce so that service agents are able to support customers directly within the service console. The remaining non-synchronized set of data will need to be accessible from Salesforce at any point in time, but UC management is concerned about storage limitations. What should a data architect recommend to meet these requirements with minimal effort?
- As needed, make callouts into Heroku Postgres and persist the data in Salesforce.
- Migrate the data to big objects and leverage Async SOQL with custom objects.
- Use Heroku Connect to bidirectionally sync all data between systems.
- Virtualize the remaining set of data with Salesforce Connect and external objects.

Northern Trail Outfitters would like to retrieve its Salesforce org's metadata programmatically for backup within a version control system. Which API is the best fit for accomplishing this task?
- Metadata API.
- SOAP API.
- Tooling API.
- Bulk API in serial mode.

Universal Containers (UC) needs to run monthly and yearly reports on opportunities and orders for sales reporting. There are 5 million opportunities and 10 million orders. Sales users are complaining that the reports regularly time out. What is the fastest and most effective way for a data architect to solve the time-out issue?
- Utilize CRM Analytics to run analytical reporting on the large data objects.
- Create custom fields on Opportunity, copy data from Order into those custom fields, and run all reports on the Opportunity object.
- Create a skinny table in Salesforce, copy order and opportunity fields into the skinny table, and create the required reports on it.
- Extract opportunity and order data from Salesforce, and use a third-party reporting tool to run reports outside of Salesforce.

A data architect is working with a large B2C retailer and needs to model the consumer account structure in Salesforce. What standard feature should be selected in this scenario?
- Account Contact.
- Person Accounts.
- Individual Accounts.
- Contacts.

Northern Trail Outfitters is concerned because some of its data is sensitive and needs to be identified for access. What should be used to provide ways to filter and identify the sensitive data?
- A custom checkbox denoting sensitive data.
- Define data classification metadata.
- Define data grouping metadata.
- Implement field-level security.

Universal Containers uses classic encryption for custom fields and is leveraging the weekly data export for data backups. During a validation process, UC discovered that encrypted field values are still being exported as part of the data export. What should a data architect recommend to make sure decrypted values are exported during the data export?
- Create another field to copy data from the encrypted field, and use this field in the export.
- Set up a custom profile for the data migration user, and assign the View Encrypted Data permission.
- Leverage an Apex class to decrypt data before exporting it.
- Set up a standard profile for the data migration user, and assign the View Encrypted Data permission.

Northern Trail Outfitters (NTO) has decided to franchise its brand. Upon implementation, 1,000 franchisees will be able to access NTO's product information and track large customer sales and opportunities through a portal. The franchisees will also be able to run monthly and quarterly sales reports and projections, as well as view the reports in dashboards. Which license does NTO need to provide these features to the franchisees?
- Salesforce Sales Cloud License.
- Lightning Platform License.
- Partner Community License.
- Customer Community License.

A casino is implementing Salesforce and is planning to build a 360-degree view of customers who visit its resorts. The casino currently maintains the following systems that record customer activity: 1. Point-of-sale system: all purchases for a customer. 2. Salesforce: all customer service and sales activities for a customer. 3. Mobile app: all bookings, preferences, and browser activity for a customer. 4. Marketing: all email, SMS, and social campaigns for a customer. Customer service agents using Salesforce would like to view the activities from all four systems to provide support to customers. The information has to be current and real time. What strategy should the data architect implement to satisfy this requirement?
- Periodically upload summary information into Salesforce to build a 360-degree view.
- Explore external data sources in Salesforce to build a 360-degree view of the customer.
- Use a customer data mart to create the 360-degree view of the customer.
- Migrate customer activities from all four systems into Salesforce.

Universal Containers (UC) needs to move millions of records from an external enterprise resource planning (ERP) system into Salesforce. In which scenario should a data architect recommend using the Bulk API in serial mode instead of parallel mode?
- Inserting 1 million orders distributed across a variety of accounts with lock exceptions eliminated and managed.
- Leveraging a controlled feed load with 10 batches per job.
- Placing 20 batches on the queue for upsert jobs.
- Inserting 1 million orders distributed across a variety of accounts with potential lock exceptions.
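For the serial-mode question above: serial versus parallel processing is chosen when the Bulk API job is created, by setting the job's concurrency mode. The snippet below is a minimal sketch of a Bulk API 1.0 job created over REST in Python; the instance URL, session ID, API version, and the Account__c field are placeholders, and error handling is omitted.

```python
# Minimal sketch: create a Bulk API 1.0 job in Serial concurrency mode, then add a CSV batch.
# Assumptions: placeholder instance URL, session ID, and API version; no error handling.
import requests

INSTANCE = "https://yourInstance.my.salesforce.com"   # placeholder
SESSION_ID = "<session id from OAuth or SOAP login>"  # placeholder
API_VERSION = "58.0"                                  # placeholder

# Serial mode processes one batch at a time, trading throughput for fewer record lock collisions.
job_xml = """<?xml version="1.0" encoding="UTF-8"?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
  <operation>insert</operation>
  <object>Order__c</object>
  <concurrencyMode>Serial</concurrencyMode>
  <contentType>CSV</contentType>
</jobInfo>"""

headers = {"X-SFDC-Session": SESSION_ID, "Content-Type": "application/xml; charset=UTF-8"}
job_resp = requests.post(f"{INSTANCE}/services/async/{API_VERSION}/job",
                         data=job_xml, headers=headers)
job_id = job_resp.text.split("<id>")[1].split("</id>")[0]  # crude XML parse, for brevity only

# Add one CSV batch of Order__c rows (Account__c is a hypothetical lookup field).
csv_batch = "Name,Account__c\nOrder-0001,001000000000001AAA\n"
requests.post(f"{INSTANCE}/services/async/{API_VERSION}/job/{job_id}/batch",
              data=csv_batch,
              headers={"X-SFDC-Session": SESSION_ID,
                       "Content-Type": "text/csv; charset=UTF-8"})
```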
Universal Containers (UC) has built a custom application on Salesforce to help track shipments around the world. A majority of the shipping records are stored on-premise in an external data source. UC needs shipment details to be exposed to the custom application, and the data needs to be accessible in real time. The external data source is not OData enabled, and UC does not own a middleware tool. Which Salesforce Connect procedure should a data architect use to ensure UC's requirements are met?
- Write an Apex class that makes a REST callout to the external API.
- Migrate the data to Heroku and register Postgres as a data source.
- Develop a process that calls an invocable web service method.
- Write a custom adapter with the Apex Connector Framework.

Universal Containers (UC) owns several Salesforce orgs across a variety of business units. UC management has declared that it needs the ability to report on Accounts and Opportunities from each org in one place. Once the data is brought together into a global view, management would like to use advanced AI-driven analytics on the dataset. Which tool should a data architect recommend to accomplish this reporting requirement?
- Run standard reports and dashboards.
- Install a third-party AppExchange tool for multi-org reporting.
- Write a Python script to aggregate and visualize the data.
- Use Einstein Analytics for multi-org.

Based on government regulations, a Salesforce customer plans to implement the following in Salesforce for compliance: 1. Access to customer information based on record ownership. 2. The ability for customers to request removal of their information from Salesforce. 3. Prevent users from accessing Salesforce from outside the company network (virtual private network, or VPN). What should a data architect recommend to address these requirements?
- Implement Salesforce Shield with Event Monitoring to address the requirements.
- Implement IP restrictions, sharing settings, and custom Apex to support customer requests.
- Allow users to access Salesforce through a custom web application hosted within the VPN.
- Contact Salesforce Support to restrict access only with VPN and the other requirements.

Universal Containers (UC) plans to implement consent management for its customers to be compliant with the General Data Protection Regulation (GDPR). UC has the following requirements: 1. UC uses Person Accounts and Contacts in Salesforce for its customers. 2. Data Protection and Privacy is enabled in Salesforce. 3. Consent should be maintained in both of these objects. 4. UC plans to verify the consent provided by customers before contacting them through email or phone. Which option should the data architect recommend to implement these requirements?
- Build a custom object to store consent information for Person Account and Contact; validate against this object before contacting customers.
- Use the Consent Management feature to validate the consent provided by the customer under the Person Account and Contact.
- Delete contact information for customers who have declined consent to be contacted.
- Configure custom fields on Person Account and Contact to store consent provided by customers, and validate consent against the fields.

Universal Containers (UC) manages Vehicle and Service History records in Salesforce. Vehicle (Vehicle__c) and Service History (Service_History__c) are both custom objects related through a lookup relationship. Every week a batch synchronization process updates the Vehicle and Service History records in Salesforce.
UC has a two-hour migration window every week and is facing locking issues as part of the data migration process. What should a data architect recommend to avoid locking issues without affecting the performance of the data migration?
- Change the lookup configuration to "Clear the value of this field" when the lookup record is deleted.
- Use Bulk API serial mode for the data migration.
- Use Bulk API parallel mode for the data migration.
- Insert the records into another custom object and use Batch Apex to move them to the Service_Order__c object.

Universal Containers (UC) has lead assignment rules to assign leads to owners. Leads not routed by the assignment rules are assigned to a dummy user. Sales reps are complaining of high load times and issues with accessing leads assigned to the dummy user. What should a data architect recommend to solve these performance issues?
- Assign the dummy user to the last role in the role hierarchy.
- Assign the dummy user to the highest role in the role hierarchy.
- Create multiple dummy users and assign leads to them.
- Periodically delete leads to reduce the number of leads.

A large automobile manufacturer has decided to use Salesforce as its CRM. It needs to maintain the following dealer types in its CRM: 1. Local dealers. 2. Regional distributors. 3. State distributors. 4. Service dealers. The attributes are different for each of the dealer types, CRM users should be allowed to enter only the attributes related to each dealer type, and the processes and business rules for each dealer type could be different. How should the different dealers be maintained in Salesforce?
- Use Accounts for dealers, and create record types for each of the dealer types.
- Create custom objects for each dealer type and custom fields for the dealer attributes.
- Create dealers as Accounts, and build custom views for each of the dealer types.
- Use Accounts for dealers and a custom picklist field for each of the dealer types.
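For the dealer-type question above: when record types are used on Account, integrations and load scripts reference the record type when creating dealer records so that the matching page layouts, picklist values, and business processes apply. The snippet below is a minimal sketch in Python using simple_salesforce; the Local_Dealer DeveloperName and the credentials are hypothetical placeholders.

```python
# Minimal sketch: create a dealer Account under a specific record type.
# Assumptions: simple_salesforce is installed, a "Local_Dealer" Account record type
# exists (hypothetical DeveloperName), and the credentials are placeholders.
from simple_salesforce import Salesforce

sf = Salesforce(
    username="integration.user@example.com",
    password="********",
    security_token="********",
)

# Look up the record type Id by its DeveloperName (hypothetical value).
rt = sf.query(
    "SELECT Id FROM RecordType "
    "WHERE SObjectType = 'Account' AND DeveloperName = 'Local_Dealer' LIMIT 1"
)
record_type_id = rt["records"][0]["Id"]

# Create the dealer Account with that record type so the matching layout,
# picklists, and business processes apply.
sf.Account.create({"Name": "Main Street Motors", "RecordTypeId": record_type_id})
```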





