
Data-Architect Exam Dumps - Salesforce Data-Architect Questions and Answers

Question # 54

Universal Containers has been a customer of Salesforce for 10 years. Currently they have 2 million accounts in the system. Due to an erroneous integration built 3 years ago, it is estimated there are 500,000 duplicates in the system.

Which solution should a data architect recommend to remediate the duplication issue?

Options:

A.

Develop an ETL process that utilizes the merge API to merge the duplicate records

B.

Utilize a data warehouse as the system of truth

C.

Extract the data using data loader and use excel to merge the duplicate records

D.

Implement duplicate rules
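
For context on option A, the merge operation it refers to is exposed both through the SOAP API and as a DML statement in Apex. The fragment below is a minimal, illustrative sketch only: it assumes a matching process has already paired a surviving master with one duplicate, and the record Ids are placeholders.

```apex
// Illustrative sketch: merge one duplicate Account into a surviving master.
// The Ids are placeholders; a real dedupe job would supply matched pairs.
Id masterId    = '001000000000001';
Id duplicateId = '001000000000002';

Account master    = new Account(Id = masterId);
Account duplicate = new Account(Id = duplicateId);

try {
    // merge reparents related records (Contacts, Opportunities, Cases, etc.)
    // onto the master and then deletes the duplicate record.
    merge master duplicate;
} catch (DmlException e) {
    System.debug('Merge failed: ' + e.getMessage());
}
```

An ETL-driven remediation would typically call the equivalent merge() operation of the SOAP API in bulk rather than running anonymous Apex, but the semantics are the same: up to two duplicates are merged into one master per request.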

Question # 55

Universal Containers has successfully migrated 50 million records into five different objects multiple times in a full copy sandbox. The Integration Engineer wants to re-run the test one month before the migration goes live in Production. What is the recommended approach to re-run the test?

Options:

A.

Truncate all 5 objects quickly and re-run the data migration test.

B.

Refresh the full copy sandbox and re-run the data migration test.

C.

Hard delete all 5 objects’ data and re-run the data migration test.

D.

Truncate all 5 objects and hard delete the data before re-running the migration test.

Question # 56

Universal Containers (UC) is a major supplier of office supplies. Some products are produced by UC and some by other manufacturers. Recently, a number of customers have complained that product descriptions on the invoices do not match the descriptions in the online catalog and on some of the order confirmations (e.g., "ballpoint pen" in the catalog and "pen" on the invoice, and item color labels are inconsistent: "wht" vs. "White" or "blk" vs. "Black"). All product data is consolidated in the company data warehouse and pushed to Salesforce to generate quotes and invoices. The online catalog and webshop are a Salesforce Customer Community solution. What is a correct technique UC should use to solve the data inconsistency?

Options:

A.

Change integration to let product master systems update product data directly in Salesforce via the Salesforce API.

B.

Add custom fields to the Product standard object in Salesforce to store data from the different source systems.

C.

Define a data taxonomy for product data and apply the taxonomy to the product data in the data warehouse.

D.

Build Apex triggers in Salesforce that ensure products have the correct names and labels after data is loaded into Salesforce.

Question # 57

Universal Containers (UC) loads bulk leads and campaigns from third-party lead aggregators on a weekly and monthly basis. The expected lead record volume is 500K records per week, and the expected campaign record volume is 10K campaigns per week. After the upload, Lead records are shared with various sales agents via sharing rules and added as Campaign members via Apex triggers on Lead creation. UC agents work on leads for 6 months but want to keep the records in the system for at least 1 year for reference. Compliance requires them to be stored for a minimum of 3 years; after that, the data can be deleted. Which statement is true with respect to a data archiving strategy for UC?

Options:

A.

UC can store long-term lead records in custom storage objects to avoid counting against storage limits.

B.

UC can leverage the Salesforce Data Backup and Recovery feature for data archival needs.

C.

UC can leverage recycle bin capability, which guarantees record storage for 15 days after deletion.

D.

UC can leverage a “tier”-based approach to classify the record storage need.
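
Whatever tiering or archival choice UC lands on, the final stage of the lifecycle described above is deletion once the 3-year compliance window has passed. The class below is a hypothetical sketch of that purge step as a batch job that could be scheduled; using CreatedDate as the retention anchor and hard-deleting via Database.emptyRecycleBin are assumptions for illustration, not a statement of UC's actual policy.

```apex
// Hypothetical sketch: hard-delete Leads that have aged past the 3-year
// compliance window. Field choices are illustrative only.
public class LeadPurgeBatch implements Database.Batchable<SObject> {

    public Database.QueryLocator start(Database.BatchableContext bc) {
        Datetime cutoff = Datetime.now().addYears(-3);
        return Database.getQueryLocator(
            'SELECT Id FROM Lead WHERE CreatedDate < :cutoff'
        );
    }

    public void execute(Database.BatchableContext bc, List<Lead> scope) {
        delete scope;
        // Empty the Recycle Bin so the deleted rows stop counting against storage.
        Database.emptyRecycleBin(scope);
    }

    public void finish(Database.BatchableContext bc) {}
}
```

Such a job could be launched periodically, for example with Database.executeBatch(new LeadPurgeBatch(), 2000) from a scheduled context.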

Question # 58

Universal Containers is exporting 40 million Account records from Salesforce using Informatica Cloud. The ETL tool fails and the query log indicates a full table scan time-out failure. What is the recommended solution?

Options:

A.

Modify the export job header to specify Export-in-Parallel.

B.

Modify the export job header to specify Sforce-Enable-PKChunking.

C.

Modify the export query to include standard indexed field(s).

D.

Modify the export query with a LIMIT clause and a batch size of 10,000.
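
Options B and C both target query selectivity. Sforce-Enable-PKChunking is a Bulk API request header that makes the platform split a large extract into ranges of record Ids so no single query has to scan the whole table; filtering on an indexed field achieves a similar effect manually. The anonymous-Apex fragment below is purely illustrative of that Id-range idea (an actual 40-million-row export would be driven by the ETL tool or the Bulk API, not by Apex, which would hit governor limits):

```apex
// Illustrative only: walk the Account table in ordered Id slices so each query
// stays selective. PK chunking does the same thing server-side for Bulk API jobs.
Id lastId = null;
Boolean hasMore = true;

while (hasMore) {
    List<Account> chunk;
    if (lastId == null) {
        chunk = [SELECT Id, Name FROM Account ORDER BY Id LIMIT 2000];
    } else {
        chunk = [SELECT Id, Name FROM Account WHERE Id > :lastId ORDER BY Id LIMIT 2000];
    }

    // ...hand the chunk to the export process here...

    hasMore = (chunk.size() == 2000);
    if (!chunk.isEmpty()) {
        lastId = chunk[chunk.size() - 1].Id;
    }
}
```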

Question # 59

Universal Containers (UC) is implementing its new Internet of Things technology, which consists of smart containers that provide updated information on container temperature and humidity back to UC every 10 minutes. There are roughly 10,000 containers equipped with this technology, with the number expected to increase to 50,000 over the next five years. It is essential that Salesforce users have access to current and historical temperature and humidity data for each container. What is the recommended solution?

Options:

A.

Create new custom fields for temperature and humidity in the existing Container custom object, as well as an external ID field that is unique for each container. These custom fields are updated when a new measure is received.

B.

Create a new Container Reading custom object, which is created when a new measure is received for a specific container. The Container Reading custom object has a master-detail relationship to the container object.

C.

Create a new Lightning Component that displays the latest humidity and temperature data for a specific container and can also display historical trends, obtaining the relevant data from UC's existing data warehouse.

D.

Create a new Container Reading custom object with a master-detail relationship to Container which is created when a new measure is received for a specific container. Implement an archiving process that runs every hour.
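
Some back-of-the-envelope arithmetic explains why archiving appears in option D: readings every 10 minutes are 6 per hour, or 144 per container per day, so 10,000 containers generate about 1.44 million reading records per day (roughly 525 million per year), and about 2.6 billion per year once the fleet reaches 50,000. The snippet below sketches how one incoming measurement might be stored against a child reading object; all object and field API names (Container__c, Device_Id__c, Container_Reading__c, and so on) are hypothetical.

```apex
// Hypothetical sketch: store one incoming measurement as a child reading record.
// Object and field API names are assumptions for illustration only.
Container__c parent = [
    SELECT Id FROM Container__c WHERE Device_Id__c = 'CTN-000123' LIMIT 1
];

insert new Container_Reading__c(
    Container__c    = parent.Id,        // master-detail to the parent container
    Temperature__c  = 4.2,
    Humidity__c     = 61.5,
    Reading_Time__c = Datetime.now()
);
```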

Question # 60

Universal Containers wants to automatically archive all inactive Account data that is older than 3 years. The information does not need to remain accessible within the application. Which two methods should be recommended to meet this requirement? Choose 2 answers

Options:

A.

Use the Force.com Workbench to export the data.

B.

Schedule a weekly export file from the Salesforce UI.

C.

Schedule jobs to export and delete using an ETL tool.

D.

Schedule jobs to export and delete using the Data Loader.
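
Whichever tool runs the export-and-delete jobs (options C and D), both start from a query that isolates inactive Accounts older than three years. The SOQL sketch below shows one possible shape of that criterion; treating LastActivityDate as the inactivity signal and the Active__c picklist value are assumptions, since UC would substitute its own definition of "inactive".

```apex
// Illustrative extract criterion for the archive job; the inactivity test is an
// assumption and should be replaced with UC's actual definition.
Date cutoff = Date.today().addYears(-3);

List<Account> toArchive = [
    SELECT Id, Name, LastActivityDate
    FROM Account
    WHERE LastActivityDate < :cutoff
      AND Active__c = 'No'
    LIMIT 200
];
System.debug(toArchive.size() + ' sample Account(s) eligible for archival.');
```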

Question # 61

An Architect needs to document the data architecture for a multi-system, enterprise Salesforce implementation.

Which two key artifacts should the Architect use? (Choose two.)

Options:

A.

User stories

B.

Data model

C.

Integration specification

D.

Non-functional requirements

Question # 62

Universal Containers (UC) is implementing Salesforce Sales Cloud and Service Cloud. As part of their implementation, they are planning to create a new custom object (Shipments), which will have a lookup relationship to Opportunities. When creating shipment records, Salesforce users need to manually input a customer reference, which is provided by customers, and will be stored in the Customer_Reference__c text custom field. Support agents will likely use this customer reference to search for Shipment records when resolving shipping issues. UC is expecting to have around 5 million shipment records created per year. What is the recommended solution to ensure that support agents using global search and reports can quickly find shipment records?

Options:

A.

Implement an archiving process for shipment records more than five years old.

B.

Implement an archiving process for shipment records more than three years old.

C.

Set Customer_Reference__c as an External ID (non-unique).

D.

Set Customer_Reference__c as an External ID (unique).
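
The reason options C and D matter at this volume is that marking Customer_Reference__c as an External ID gives the field a custom index, so search and report filters on that value stay selective even with roughly 5 million new shipment records per year. Below is a minimal sketch of the kind of lookup this enables, assuming a hypothetical Shipment__c object with an Opportunity__c lookup field.

```apex
// With Customer_Reference__c flagged as an External ID (and therefore indexed),
// this filter avoids a full scan of the Shipment__c table.
String customerRef = 'CR-2024-000417';   // reference supplied by the customer

List<Shipment__c> shipments = [
    SELECT Id, Name, Customer_Reference__c, Opportunity__c
    FROM Shipment__c
    WHERE Customer_Reference__c = :customerRef
];
```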

Question # 63

Universal Containers (UC) has a multi-level master-detail relationship spanning Opportunities, a custom opportunity line item object, and a custom discount request object. Opportunity is the master and the custom line item object is the detail in one master-detail relationship, and the custom line item object is the master and the custom discount request object is the detail in another. UC has a requirement to show the sum of discounts across line items at the opportunity level. What is the recommended solution to address these requirements?

Options:

A.

Use a roll-up summary for the line-item-level total and a trigger for the opportunity-level amount summary, as only one level of roll-up is allowed.

B.

Update the master-detail relationships to lookup relationships in order to allow the discount amount to roll up.

C.

Remove the master-detail relationships and rely completely on workflow/triggers to summarize the discount amount.

D.

Roll up the discount request amount at the line item level, and roll up the line-item-level summary discount at the opportunity level.
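
Option A mentions a trigger-based summary at the opportunity level; the sketch below illustrates that pattern. It is a hedged example only: Line_Item__c, Discount_Total__c, Opportunity__c, and Opportunity.Total_Discount__c are hypothetical API names, and the sketch skips edge cases such as re-zeroing an opportunity whose last line item was deleted.

```apex
// Hypothetical sketch: keep a custom Opportunity.Total_Discount__c field in sync
// with the sum of each line item's discount total.
trigger LineItemDiscountRollup on Line_Item__c (
    after insert, after update, after delete, after undelete
) {
    // Collect the parent opportunity Ids touched by this trigger invocation.
    Set<Id> oppIds = new Set<Id>();
    if (Trigger.isDelete) {
        for (Line_Item__c li : Trigger.old) {
            oppIds.add(li.Opportunity__c);
        }
    } else {
        for (Line_Item__c li : Trigger.new) {
            oppIds.add(li.Opportunity__c);
        }
    }

    // Re-sum the per-line discount totals and write them back to the parents.
    List<Opportunity> updates = new List<Opportunity>();
    for (AggregateResult ar : [
        SELECT Opportunity__c oppId, SUM(Discount_Total__c) total
        FROM Line_Item__c
        WHERE Opportunity__c IN :oppIds
        GROUP BY Opportunity__c
    ]) {
        updates.add(new Opportunity(
            Id = (Id) ar.get('oppId'),
            Total_Discount__c = (Decimal) ar.get('total')
        ));
    }
    update updates;
}
```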

Exam Code: Data-Architect
Exam Name: Salesforce Certified Data Architect (SP25)
Last Update: Jul 5, 2025
Questions: 257