Test Title:
OtroTest4

Description:
OtroTest4

Creation Date: 2020/08/18

Category: Other

Number of Questions: 56

Rating: (1)
Questions:

Universal Containers is integrating a new Opportunity engagement system with Salesforce. According to their Master Data Management strategy, Salesforce is the system of record for Account, Contact, and Opportunity data. However, there does seem to be valuable Opportunity data in the new system that potentially conflicts with what is stored in Salesforce. What is the recommended course of action to appropriately integrate this new system?

- Stakeholders should be brought together to discuss the appropriate data strategy moving forward.
- The Opportunity engagement system should become the system of record for Opportunity records.
- The MDM strategy defines Salesforce as the system of record, so Salesforce Opportunity values prevail in all conflicts.
- A policy should be adopted so that the system whose record was most recently updated should prevail in conflicts.

Universal Containers wants their Shipment custom object to always relate to a Container, a Sender, and a Receiver (all separate custom objects). If a Shipment is currently associated with a Container, Sender, or Receiver, deletion of those records should not be allowed. They also want separate sharing models on each custom object. What approach should an architect take to fulfill these requirements?

- Create a Master-Detail relationship to each of the three parent records.
- Create a required Lookup relationship to each of the three parent records.
- Create two Master-Detail and one Lookup relationship to the parent records.
- Use a VLOOKUP formula field to associate the Shipment to each parent record.

Universal Containers (UC) is planning to move away from a legacy CRM to Salesforce. As part of a one-time data migration, UC will need to keep the original date when a contact was created in the legacy system. How should an Architect design the data migration solution to meet this requirement?

- Create a new field on the Contact object to capture the Created Date. Hide the standard CreatedDate field using Field-Level Security.
- Write an Apex trigger on the Contact object, before insert event, to set the original value in the standard CreatedDate field.
- Enable "Set Audit Fields" and assign the permission to the user loading the data for the duration of the migration.
- After the data is migrated, perform an update on all records to set the original date in the standard CreatedDate field.

An architect has been asked to provide error messages when a future date is detected in a custom Birthdate__c field on the Contact object. The client wants the ability to translate the error messages. What are two approaches the architect should use to achieve this solution? (Choose 2 answers)

- Create a workflow field update to set the standard ErrorMessage field.
- Create a trigger on Contact and add an error to the record with a custom label.
- Create a validation rule and translate the error message with Translation Workbench.
- Implement a third-party validation process with translate functionality.
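The trigger-with-custom-label option amounts to a simple date comparison that returns a locale-specific message. A minimal Python sketch of that logic, assuming hypothetical message text (the `ERROR_MESSAGES` map stands in for a translated Salesforce custom label; in Apex, Translation Workbench would resolve the label per the user's locale):

```python
from datetime import date

# Illustrative stand-in for a translated custom label; the actual label
# name and translations are assumptions, not from the source.
ERROR_MESSAGES = {
    "en": "Birthdate cannot be in the future.",
    "es": "La fecha de nacimiento no puede ser futura.",
}

def validate_birthdate(birthdate: date, locale: str = "en"):
    """Return a localized error message if the date is in the future, else None."""
    if birthdate > date.today():
        return ERROR_MESSAGES.get(locale, ERROR_MESSAGES["en"])
    return None
```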

What is an advantage of using Custom Metadata Types over Custom Settings?

- Custom metadata records are editable in Apex.
- Custom metadata records are deployable using packages.
- Custom metadata records are not copied from production to sandbox.
- Custom metadata types are available for reporting.

The invoicing system at Universal Containers requires that attachments associated with the Invoice__c custom object be classified by Type (e.g., "Receipt," "Invoice PDF") so that reporting can be done on invoices showing the number of attachments grouped by Type. What approach should be taken to categorize the attachments to meet these requirements?

- Create a custom picklist field for the Type on the standard Attachment object with the values.
- Create a custom object related to the Invoice object with a picklist field for the Type.
- Add additional options to the standard ContentType picklist field for the Attachment object.
- Add a ContentType picklist field to the Attachment layout and create additional picklist options.

Universal Containers has a legacy system that captures Conferences and Venues. These Conferences can occur at any Venue. They create hundreds of thousands of Conferences per year. Historically, they have only used 20 Venues. Which two things should the data architect consider when de-normalizing this data model into a single Conference object with a Venue picklist? (Choose 2 answers)

- Org data storage limitations.
- Bulk API limitations on picklist fields.
- Limitations on Master-Detail relationships.
- Standard list view in-line editing.

Universal Containers (UC) has around 200,000 Customers (stored in the Account object). They receive 1 or 2 Orders every month from each Customer. Orders are stored in a custom object called Order__c, which has about 50 fields. UC is expecting 10% year-over-year growth. Which two considerations should an architect take into account to improve the performance of SOQL queries that retrieve data from the Order__c object? (Choose 2 answers)

- Make the queries more selective using indexed fields.
- Work with Salesforce Support to enable Skinny Tables.
- Reduce the number of triggers on the Order__c object.
- Use SOQL queries without WHERE conditions.
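Whether a WHERE clause counts as "selective" is defined by the query optimizer's documented thresholds. A sketch of those thresholds, with the percentages and caps taken from Salesforce's query and search optimization documentation:

```python
# A filter is considered selective only if the number of rows it targets
# falls under these documented caps; otherwise the optimizer falls back
# to a full scan and the query slows down as Order__c volume grows.

def standard_index_threshold(total_rows: int) -> int:
    # 30% of the first million rows + 15% of the rest, capped at 1,000,000.
    first = min(total_rows, 1_000_000)
    rest = max(total_rows - 1_000_000, 0)
    return min(int(first * 0.30 + rest * 0.15), 1_000_000)

def custom_index_threshold(total_rows: int) -> int:
    # 10% of the first million rows + 5% of the rest, capped at 333,333.
    first = min(total_rows, 1_000_000)
    rest = max(total_rows - 1_000_000, 0)
    return min(int(first * 0.10 + rest * 0.05), 333_333)
```

For example, at 200,000 customers placing 1-2 orders a month, Order__c passes several million rows within a couple of years, so filters must target well under these caps to stay selective.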

Universal Containers (UC) provides shipping services to its customers. They use Opportunities to track customer shipments. At any given time, shipping status can be one of 10 values. UC has 200,000 Opportunity records. When creating a new field to track shipping status on Opportunity, what should the architect do to improve data quality and avoid data skew?

- Create a Lookup to a custom object ShippingStatus__c.
- Create a picklist field, values sorted alphabetically.
- Create a text field and make it an external ID.
- Create a Master-Detail to a custom object ShippingStatus__c.

Universal Containers (UC) management has identified a total of ten text fields on the Contact object as important enough to capture any changes made to them, such as who made the change, when it was made, what the old value was, and what the new value is. UC needs to be able to report on these field data changes within Salesforce for the past 3 months. What are two approaches that will meet this requirement? (Choose 2 answers)

- Create a workflow to evaluate the rule when a record is created, and use field update actions to store the previous values of these ten fields in ten new fields.
- Create a Contact report including these ten fields and the Salesforce Id, then schedule the report to run once a day and send email to the admin.
- Write an Apex trigger on Contact, after insert and after update events, and store the old values in another custom object.
- Turn on field history tracking on the Contact object for these ten fields, then create reports on Contact History.

Universal Containers (UC) has an open sharing model for its Salesforce users to allow all its Salesforce internal users to edit all contacts, regardless of who owns the contact. However, UC management wants to allow only the owner of a contact record to delete that contact. If a user does not own the contact, then the user should not be allowed to delete the record. How should the architect approach the project so that the requirements are met?

- Set the Sharing settings as Public Read Only for the Contact object.
- Create a "before delete" trigger to check if the current user is not the owner.
- Create a validation rule on the Contact object to check if the current user is not the owner.
- Set the profile of the users to remove delete permission from the Contact object.

Universal Containers (UC) uses Salesforce for tracking opportunities (Opportunity). UC uses an internal ERP system for tracking deliveries and invoicing. The ERP system supports SOAP API and OData for bi-directional integration between Salesforce and the ERP system. UC has about one million opportunities. For each opportunity, UC sends 12 invoices, one per month. UC sales reps need to view current invoice status and invoice amount from the Opportunity page. When creating an object to model invoices, what should the architect recommend, considering performance and data storage space?

- Create a custom object Invoice__c with a Lookup relationship with Opportunity.
- Create a custom object Invoice__c with a Master-Detail relationship with Opportunity.
- Use the Streaming API to get the current status from the ERP and display it on the Opportunity page.
- Create an external object Invoice__x with a Lookup relationship with Opportunity.

Universal Containers has a large number of Opportunity fields (100) that they want to track field history on. Which two actions should an architect perform in order to meet this requirement? (Choose 2 answers)

- Use Analytic Snapshots to store a copy of the record when changed.
- Select the 100 fields in the Opportunity Set History Tracking page.
- Create a custom object to store a copy of the record when changed.
- Create a custom object to store the previous and new field values.

In a Salesforce org used to manage Contacts, which two options should be considered to maintain data quality? (Choose 2 answers)

- Use workflow to delete duplicate records.
- Use validation rules on new record create and edit.
- Use the private sharing model.
- Use Salesforce duplicate management.

Universal Containers is looking to use Salesforce to manage their sales organization. They will be migrating legacy account data from two aging systems into Salesforce. Which two design considerations should an architect take to minimize data duplication? (Choose 2 answers)

- Clean data before importing to Salesforce.
- Import the data concurrently.
- Use Salesforce matching and duplicate rules.
- Use a workflow to check and prevent duplicates.

Managers at Universal Containers (UC) have noticed that shipment records (a custom object) are being sent to the shipping department with bad address data; specifically, addresses have missing data, like City, and poorly formatted postal codes. Which two approaches will solve this issue? (Choose 2 answers)

- Edit each of the page layouts to require that each address field contains data.
- Use a Validation Rule using CONTAINS to ensure address fields contain data.
- Use a Validation Rule using REGEX to ensure proper postal code formatting.
- Write an Apex Trigger to require all of the fields on the page layouts.
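The REGEX option reduces to a pattern match on the postal code field. An illustrative Python version, assuming US-style ZIP codes (the pattern itself is an assumption; the equivalent validation rule would use the REGEX() formula function with a similar expression and would need adjusting per country):

```python
import re

# Accepts 5-digit ZIP codes with an optional 4-digit extension,
# e.g. "94105" or "94105-1234". Purely illustrative of the check
# a REGEX-based validation rule would perform server-side.
US_ZIP = re.compile(r"^\d{5}(-\d{4})?$")

def is_valid_postal_code(value: str) -> bool:
    return bool(US_ZIP.match(value or ""))
```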

Universal Containers (UC) has multi-level account hierarchies that represent departments within their major Accounts. Users are creating duplicate Contacts across multiple departments. UC wants to clean the data so as to have a single Contact across departments. Which two solutions should UC implement to cleanse their data? (Choose 2 answers)

- Make use of a third-party tool to help merge duplicate Contacts across Accounts.
- Use Workflow rules to standardize Contact information to identify and prevent duplicates.
- Use Data.com to standardize Contact address information to help identify duplicates.
- Make use of the Merge Contacts feature of Salesforce to merge duplicates for an Account.

Universal Containers has defined a new Data Quality Plan for their Salesforce data and wants to know how they can enforce it throughout the organization. Which two approaches should an architect recommend to enforce this new plan? (Choose 2 answers)

- Store all data in an external system and set up integration to Salesforce for view-only access.
- Schedule reports that will automatically catch duplicates and merge or delete the records every week.
- Use Workflow, Validation Rules, and Force.com code (Apex) to enforce critical business processes.
- Schedule a weekly dashboard displaying records that are missing information to be sent to managers for review.

A customer has an integration that creates records in a Salesforce custom object. The custom object has a field marked as required on the page layout. The customer has noticed that many of the records coming from the external system are missing data in this field. Which two things should the architect do to ensure this field always contains data coming from the source system? (Choose 2 answers)

- Create a Workflow to default a value into this field.
- Blame the customer's external system for bad data.
- Set up a Validation Rule to prevent blank values.
- Mark the field required in Setup at the field level.

Universal Containers has two systems, Salesforce and an on-premise ERP system. An architect has been tasked with copying Opportunity records to the ERP once they reach a Closed/Won stage. The Opportunity record in the ERP system will be read-only for all fields copied in from Salesforce. What is the optimal real-time approach that achieves this solution?

- Have the ERP poll Salesforce nightly and bring in the desired Opportunities.
- Implement a workflow rule that sends Opportunity data through Outbound Messaging.
- Implement an hourly integration to send Salesforce Opportunities to the ERP system.
- Implement a Master Data Management system to determine the system of record.

Universal Containers (UC) has three systems: Salesforce, a cloud-based ERP system, and an on-premise Order Management System (OMS). An architect has been tasked with creating a solution that uses Salesforce as the system of record for Leads and the OMS as the system of record for Accounts and Contacts. UC wants Accounts and Contacts to be able to maintain their names in each system (i.e., "John Doe" in the OMS and "Johnny Doe" in Salesforce), but wants to have a consolidated data store which links referenced records across the systems. What approach should an architect suggest so the requirements are met?

- Use the Streaming API to send Account and Contact data from Salesforce to the OMS.
- Implement an integration tool to send OMS Accounts and Contacts to Salesforce.
- Implement a Master Data Management strategy to reconcile Leads, Accounts, and Contacts.
- Have Salesforce poll the OMS nightly and bring in the desired Accounts and Contacts.

A customer is integrating two different systems with customer records into the Salesforce Account object. Master Data Management will be used to ensure that no duplicate records are created in Salesforce. How can the architect determine which system is the system of record at a field level?

- Master Data Management systems determine the system of record, and the architect doesn't have to think about what data is controlled by what system.
- Any fields with the same purpose between the two systems should be reviewed by the key stakeholders to see how they will be used in Salesforce.
- Any field that is an input field in either external system will be overwritten by the last record integrated and can never have a system of record.
- Review the database schema for each external system; any fields with different names should always be separate fields in Salesforce.

In a disparate multi-system ERP environment where Salesforce is being deployed, which two techniques should be used to maintain data synchronization between systems? (Choose 2 answers)

- Build synchronization reports and dashboards.
- Establish an MDM strategy to outline a single source of truth.
- Use Workbench to update files within systems.
- Integrate Salesforce with the ERP environment.

All accounts and opportunities are created in Salesforce. Salesforce is integrated with three systems:

- An ERP system feeds order data into Salesforce and updates both Account and Opportunity records.
- An accounting system feeds invoice data into Salesforce and updates both Account and Opportunity records.
- A commission system feeds commission data into Salesforce and updates both Account and Opportunity records.

How should the architect determine which of these systems is the system of record?

- Data flows should be reviewed with the business users to determine the system of record per object or field.
- Whatever integration data flow runs last will, by default, determine which system is the system of record.
- Account and opportunity data originates in Salesforce, and therefore Salesforce is the system of record.
- Whatever system updates the attribute or object should be the system of record for that field or object.

What are two valid metadata types that should be included to document the data architecture of a Salesforce org? (Choose 2 answers)

- Document.
- Record Type.
- Security Settings.
- Custom Field.

What are two key artifacts used to document the data architecture for a multi-system enterprise Salesforce implementation? (Choose 2 answers)

- Integration specification.
- Non-functional requirements.
- Data model.
- User stories.

As part of a phased Salesforce rollout, there will be 3 deployments spread out over the year. The requirements have been carefully documented. Which two methods should an architect use to trace configuration changes back to the detailed requirements? (Choose 2 answers)

- Review the setup audit trail for configuration changes.
- Maintain a data dictionary with the justification for each field.
- Put the business purpose in the Description of each field.
- Use the Force.com IDE to save the metadata files in source control.

How can an architect find information about who is creating, changing, or deleting certain fields within the past two months?

- Export the setup audit trail and find the fields in question.
- Remove "Customize Application" permissions from everyone else.
- Create a field history report for the fields in question.
- Export the metadata and search it for the fields in question.

Universal Containers wants to automatically archive all inactive Account data that is older than 3 years. The information does not need to remain accessible within the application. Which two methods should be recommended to meet this requirement? (Choose 2 answers)

- Schedule jobs to export and delete using the Data Loader.
- Schedule a weekly export file from the Salesforce UI.
- Use the Force.com Workbench to export the data.
- Schedule jobs to export and delete using an ETL tool.

Which two automated approaches should an architect recommend to purge old data out of Salesforce and aggregate that data in Salesforce? (Choose 2 answers)

- Third-party Business Intelligence system.
- Apex Triggers.
- Schedulable Batch Apex.
- Third-party Integration Tool (ETL).

Universal Containers (UC) is concerned that data is being corrupted daily, either through negligence or maliciousness. They want to implement a backup strategy to help recover any corrupted data, or data mistakenly changed or even deleted. What should the data architect consider when designing a field-level audit and recovery plan?

- Review projected data storage needs.
- Schedule a weekly export file.
- Reduce data storage by purging old data.
- Implement an AppExchange package.

A Salesforce customer has millions of orders every year. Each order contains, on average, ten line items. The customer wants its Sales Reps to know how much money each customer generates year-over-year, but they are running out of data storage in Salesforce. What data archiving plan should the architect recommend?

- Annually delete orders and order line items and ensure the customer has order information in another system.
- Annually aggregate order amount data to store in a custom object, then delete those orders and order line items.
- Annually export and delete order line items and store them in a zip file in case the data is needed later.
- Annually export and delete orders and order line items and store them in a zip file in case the data is needed later.
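The aggregate-then-delete option can be sketched as a roll-up of order amounts into one summary row per customer per year. The field and object names below are illustrative, not from the source:

```python
from collections import defaultdict

def aggregate_orders(orders):
    """Roll order amounts up to one row per (account, year).

    orders: iterable of dicts with account_id, year, amount keys.
    The returned rows would be written to a small custom summary object
    (e.g. a hypothetical Yearly_Order_Summary__c) before the detail
    orders and line items are deleted to reclaim storage.
    """
    totals = defaultdict(float)
    for o in orders:
        totals[(o["account_id"], o["year"])] += o["amount"]
    return [
        {"account_id": acct, "year": year, "total_amount": amt}
        for (acct, year), amt in sorted(totals.items())
    ]
```

Millions of order rows per year collapse into at most one row per customer per year, which is why this option addresses the storage limit while preserving the year-over-year reporting the Sales Reps need.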

A customer monitors over 10,000 servers, and these servers automatically record their status every 15 minutes. The customer is required to maintain all of these status reports for a period of 10 years. Service Reps need access to up to one week's worth of these status reports with all of their details. Which two limits should an architect consider when recommending what data should be integrated into Salesforce and for how long it should be stored in Salesforce? (Choose 2 answers)

- Data storage limits.
- Web service callout limits.
- API request limits.
- Workflow rule limits.
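The data storage concern here is easy to quantify. A back-of-the-envelope calculation, assuming the 2 KB per record that Salesforce uses for most record storage accounting:

```python
SERVERS = 10_000
READINGS_PER_DAY = 24 * 60 // 15        # one status every 15 minutes = 96/day
BYTES_PER_RECORD = 2 * 1024             # typical per-record storage allocation

# One week of detail in Salesforce, per the Service Rep requirement:
weekly_records = SERVERS * READINGS_PER_DAY * 7        # 6,720,000 rows
weekly_storage_gb = weekly_records * BYTES_PER_RECORD / 1024**3   # ~13 GB

# The full 10-year retention clearly belongs outside Salesforce:
ten_year_records = SERVERS * READINGS_PER_DAY * 365 * 10   # ~3.5 billion rows
```

Even one week of detail is millions of rows, and loading them in continuously also consumes API requests, which is why data storage and API request limits are the two to weigh.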

A Salesforce customer has plenty of data storage. Sales Reps are complaining that searches are bringing back old records that aren't relevant any longer. Sales Managers need the data for their historical reporting. What strategy should a data architect use to ensure a better user experience for the Sales Reps?

- Use Batch Apex to archive old data on a rolling nightly basis.
- Create a Permission Set to hide old data from Sales Reps.
- Archive and purge old data from Salesforce on a monthly basis.
- Set data access to Private to hide old data from Sales Reps.

Universal Containers (UC) is implementing a formal, cross-business-unit data governance program. As part of the program, UC will implement a team to make decisions on enterprise-wide data governance. Which two roles are appropriate as members of this team? (Choose 2 answers)

- Operational Data Users.
- Data Domain Stewards.
- Analytics/BI Owners.
- Salesforce Administrators.

Universal Containers (UC) has a complex system landscape and is implementing a data governance program for the first time. Which two first steps would be appropriate for UC to initiate an assessment of data architecture? (Choose 2 answers)

- Engage with executive sponsorship to assess enterprise data strategy and goals.
- Engage with business units and IT to assess current operational systems and data models.
- Engage with IT program managers to assess current velocity of projects in the pipeline.
- Engage with database administrators to assess current database performance metrics.

A data architect has been tasked with optimizing a data stewardship engagement for a Salesforce instance. Which three areas of Salesforce should the architect review before proposing any design recommendation? (Choose 3 answers)

- Run key reports to determine what fields should be required.
- Review the metadata XML files for redundant fields to consolidate.
- Determine if any integration points create records in Salesforce.
- Export the setup audit trail to review what fields are being used.
- Review the sharing model to determine impact on duplicate records.

To avoid creating duplicate Contacts, a customer frequently uses Data Loader to upsert Contact records into Salesforce. What common error should the data architect be aware of when using upsert?

- Errors when a duplicate Contact name is found cause the upsert to fail.
- Errors with duplicate external Id values within the same CSV file.
- Errors with records being updated and inserted in the same CSV file.
- Errors with using the wrong external Id will cause the load to fail.
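The classic upsert failure is two rows in the same CSV carrying the same external Id: Salesforce rejects those rows as duplicates within one batch. A small pre-flight check along these lines can catch the problem before the load (the field name `External_Id__c` is illustrative):

```python
from collections import Counter

def duplicate_external_ids(rows, key="External_Id__c"):
    """Return the external Id values that appear more than once.

    rows: parsed CSV rows as dicts. Blank/missing keys are ignored,
    since those rows fail for a different reason (no match key).
    """
    counts = Counter(row[key] for row in rows if row.get(key))
    return sorted(eid for eid, n in counts.items() if n > 1)
```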

Universal Containers has deployed Salesforce for case management. The company is having difficulty understanding what percentage of cases are resolved on the initial call to their support organization. What first step is recommended to implement a reporting solution to measure the support reps' case closure rates?

- Enable field history tracking on the Case object.
- Create a report on Case analytic snapshots.
- Create Contact and Opportunity reports and dashboards.
- Install AppExchange packages for available reports.

In their legacy system, Universal Containers has a monthly accounts receivable report that compiles data from Accounts, Contacts, Opportunities, Orders, and Order Line Items. What difficulty will an architect run into when implementing this in Salesforce?

- A report cannot contain data from Accounts and Contacts.
- Salesforce allows up to four objects in a single report type.
- Salesforce does not support Orders or Order Line Items.
- Custom report types cannot contain Opportunity data.

Universal Containers keeps its Account data in Salesforce and its Invoice data in a third-party ERP system. They have connected the Invoice data through a Salesforce external object. They want data from both Accounts and Invoices visible in one report in one place. Which two approaches should an architect suggest for achieving this solution? (Choose 2 answers)

- Create a report combining data from the Account standard object and the Invoices external object.
- Create a Visualforce page combining Salesforce Account data and Invoice external object data.
- Create a report in an external system combining Salesforce Account data and Invoice data from the ERP.
- Create separate Salesforce reports for Accounts and Invoices and combine them in a dashboard.

Universal Containers wishes to maintain Lead data from Leads even after they are deleted and cleared from the Recycle Bin. What approach should be implemented to achieve this solution?

- Query Salesforce with the queryAll API method or using the ALL ROWS SOQL keywords.
- Use a Lead standard report and filter on the IsDeleted standard field.
- Send data to a Data Warehouse and mark Leads as deleted in that system.
- Use a Converted Lead report to display data on Leads that have been deleted.

Universal Containers (UC) has deployed Salesforce to manage Marketing, Sales, and Support efforts in a multi-system ERP environment, and has reached the limits of native reports and dashboards. UC leadership is looking to understand what options can be used to provide more analytical insights. Which two approaches should an architect recommend? (Choose 2 answers)

- AppExchange Apps.
- Weekly Snapshots.
- Wave Analytics.
- Setup Audit Trails.

Universal Containers is setting up an external Business Intelligence (BI) system and wants to extract 1,000,000 Contact records. What should be recommended to avoid timeouts during the export process?

- Utilize the Bulk API to export the data.
- Use the SOAP API to export the data.
- Use GZIP compression to export the data.
- Schedule a Batch Apex job to export the data.

Universal Containers (UC) is a business that works directly with individual consumers (B2C). They are moving from a home-grown CRM system to Salesforce. UC has about one million consumer records. What should the architect recommend for optimal use of Salesforce functionality while avoiding data loading issues?

- Create a custom object Individual_Consumer__c to load all individual consumers.
- Load all individual consumers as Account records and avoid using the Contact object.
- Load one Account record and one Contact record for each individual consumer.
- Create one Account and load individual consumers as Contacts linked to that one Account.

Universal Containers (UC) has a data model as shown in the image. The Project object has a private sharing model, and it has Roll-Up Summary fields to calculate the number of resources assigned to the project, total hours for the project, and the number of work items associated with the project. What should the architect consider, knowing there will be a large amount of time entry records to be loaded regularly from an external system into Salesforce?

- Use triggers to calculate summary values instead of Roll-Ups.
- Load all data using external IDs to link to parent records.
- Use workflow to calculate summary values instead of Roll-Ups.
- Load all data after deferring sharing calculations.

Universal Containers is migrating their legacy system's users and data to Salesforce. They will be creating 10,000 users, 2 million Account records, and 10 million Invoice records. The visibility of these records is controlled by a few dozen owner- and criteria-based sharing rules. Which two approaches will minimize data loading time during this migration to a new organization? (Choose 2 answers)

- First load all Account records, and then load all User records.
- Defer sharing calculations until the data has finished uploading.
- Create the users, upload all data, and then deploy the sharing rules.
- Contact Salesforce to activate indexing before uploading the data.

Universal Containers wishes to send data from Salesforce to an external system to generate invoices from their Order Management System (OMS). They want a Salesforce administrator to be able to customize which fields will be sent to the external system without modifying code. Which two approaches should an architect recommend to deliver the desired solution? (Choose 2 answers)

- An Outbound Message to determine which fields to send to the OMS.
- A Set<sObjectField> to determine which fields to send in an HTTP callout.
- A Field Set that determines which fields to send in an HTTP callout.
- Enable the field-level security permissions for the fields to send.

The architect is planning a large data migration for Universal Containers from their legacy CRM system to Salesforce. Which three things should the architect consider to optimize the performance of the data migration? (Choose 3 answers)

- Deactivate approval processes and workflow rules.
- Remove custom indexes on the data being loaded.
- Review the time zone of the User loading the data.
- Defer sharing calculations of the Salesforce org.
- Determine if the legacy system is still in use.

Universal Containers has a large volume of Contact data going into Salesforce.com. There are 100,000 existing Contact records, and 200,000 new Contacts will be loaded. The Contact object has an external ID field that is unique and is populated for all existing records. What should the architect recommend to reduce data load processing time?

- Load new records via the Insert operation and existing records via the Update operation.
- Load Contact records together using the Streaming API via the Upsert operation.
- Load all records via the Upsert operation to determine new records vs. existing records.
- Delete all existing records, and then load all records together via the Insert operation.
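For comparison, the insert-plus-update alternative requires splitting the file client-side against the set of external Ids already in Salesforce, which is exactly the extra querying and file handling a single upsert keyed on the external Id avoids. A sketch of that client-side split (field name illustrative):

```python
def split_for_insert_update(rows, existing_ids, key="External_Id__c"):
    """Partition rows into (inserts, updates) given the external Ids
    already present in Salesforce. A single upsert on the external Id
    makes this step unnecessary: Salesforce matches each row itself.
    """
    inserts = [r for r in rows if r[key] not in existing_ids]
    updates = [r for r in rows if r[key] in existing_ids]
    return inserts, updates
```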

An architect is planning on having different batches load one million Opportunities into Salesforce using the Bulk API in parallel mode. What should be considered when loading the Opportunity records?

- Create indexes on Opportunity object text fields.
- Sort batches by Name field values.
- Group batches by the AccountId field.
- Order batches by Auto-Number field.
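Ordering child records by their parent Id before batching is the standard mitigation for parent-row lock contention: when parallel batches each touch many different Accounts, two batches can try to lock the same parent at once and fail with UNABLE_TO_LOCK_ROW. A sketch of the grouping (10,000 is the Bulk API's per-batch record maximum):

```python
def batches_grouped_by_account(records, batch_size=10_000):
    """Sort Opportunity rows by AccountId, then slice into batches.

    Each batch then covers a contiguous run of parent Accounts, so
    parallel batches rarely contend for the same parent row lock.
    """
    ordered = sorted(records, key=lambda r: r["AccountId"])
    return [ordered[i:i + batch_size] for i in range(0, len(ordered), batch_size)]
```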

Universal Containers has more than 10 million records in the Order__c object. A bulk query against it has timed out. What should be considered to resolve the query timeout?

- Streaming API.
- PK Chunking.
- Metadata API.
- Tooling API.
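PK chunking is requested with the Sforce-Enable-PKChunking header on a Bulk API job; Salesforce then splits the extract into record-Id range slices (default chunk size 100,000 records) so no single cursor has to scan all 10 million rows. A sketch of the resulting ranges, using integers in place of real base-62 record Ids for clarity:

```python
def pk_chunk_ranges(min_id, max_id, chunk_size=100_000):
    """Illustrate the Id-range slices PK chunking generates server-side.

    Each (start, end) pair corresponds to an added filter of the form
    WHERE Id >= start AND Id <= end on one chunked sub-query.
    """
    ranges = []
    start = min_id
    while start <= max_id:
        end = min(start + chunk_size - 1, max_id)
        ranges.append((start, end))
        start = end + 1
    return ranges
```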

Global Containers (GC) just acquired Universal Containers (UC). Both companies use Salesforce, and as part of the acquisition, all of the data in the UC Salesforce instance (source) must be migrated into the GC Salesforce instance (target). Universal Containers has over 5 million Case records. What should the architect consider when trying to optimize the data load time?

- Load Case data directly leveraging Salesforce-to-Salesforce functionality.
- Break the load into multiple sets of data to be loaded using Bulk API parallel processes.
- Use the Salesforce Org Migration Tool from the Setup Data Management menu.
- Pre-process the data, then use Data Loader with the SOAP API to upsert with zip compression enabled.

Universal Containers (UC) has users complaining about reports timing out or simply taking too long to run. Which two actions should the data architect recommend to improve the reporting experience? (Choose 2 answers)

- Index key fields used in report criteria.
- Enable Divisions for large data objects.
- Share each report with fewer users.
- Create one skinny table per report.

Universal Containers (UC) has over 10 million records. They have a nightly integration that queries these records, and the queries are timing out. What should the data architect do or look for when troubleshooting the queries? (Choose 2 answers)

- Change the integration user's profile to have View All Data.
- Create a formula field instead of having multiple filter criteria.
- Create custom indexes on the fields used in the filter criteria.
- Ensure the query doesn't contain NULL in any filter criteria.

Universal Containers (UC) is implementing a Salesforce project with large volumes of data and daily transactions. The solution includes both real-time web service integrations and Visualforce mash-ups with back-end systems. The Salesforce Full sandbox used by the project integrates with full-scale back-end testing systems. Which two types of performance testing are appropriate for this project? (Choose 2 answers)

- Stress testing against the web services hosted by the integration middleware.
- Pre-go-live automated page-load testing against the Salesforce Full sandbox.
- Pre-go-live unit testing in the Salesforce Full sandbox.
- Post-go-live automated page-load testing against the Salesforce Production org.
