Our pass rate is as high as 98.9%, and the overlap between our 70-767 study guide and the real exam is about 90%, based on our seven years of teaching experience. Do you want to pass the Microsoft 70-767 exam on your first attempt? Try the latest Microsoft 70-767 practice questions and answers below.

Q11. You are developing a project that contains multiple SQL Server Integration Services (SSIS) packages. The packages will be deployed to the SSIS catalog. One of the steps in each package accesses an FTP site to download data files.

You create project parameters to store the username and password that are used to access the FTP site.

You need to ensure that the username and password values are encrypted when they are deployed.

What should you do?

A. Convert the parameters to package parameters.

B. Set the Sensitive property of the parameters to True.

C. Set the ProtectionLevel property of the package to EncryptSensitiveWithPassword.

D. Convert the project to the Legacy Deployment model.

Answer: B
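
For context: once a project whose parameters are marked Sensitive is deployed to the SSIS catalog, their values are stored encrypted and the catalog views return NULL for them. A minimal T-SQL sketch of assigning such a value after deployment, with hypothetical folder, project, and parameter names:

-- Hypothetical names; 20 = project-level object, 'V' = literal value.
EXEC SSISDB.catalog.set_object_parameter_value
    @object_type     = 20,
    @folder_name     = N'ETL',
    @project_name    = N'FtpLoad',
    @parameter_name  = N'FtpPassword',
    @parameter_value = N'P@ssw0rd!',
    @value_type      = 'V';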


Q12. You administer a Microsoft SQL Server 2016 server that has SQL Server Integration Services (SSIS) installed.

You plan to deploy new SSIS packages to the server. The SSIS packages use the Project Deployment Model together with parameters and Integration Services environment variables.

You need to configure the SQL Server environment to support these packages. What should you do?

A. Create SSIS configuration files for the packages.

B. Create an Integration Services catalog.

C. Install Data Quality Services.

D. Install Master Data services.

Answer: B

Explanation: 

Reference:

http://msdn.microsoft.com/en-us/library/hh479588.aspx
http://msdn.microsoft.com/en-us/library/hh213290.aspx
http://msdn.microsoft.com/en-us/library/hh213373.aspx
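
Once the Integration Services catalog (SSISDB) exists, environments and environment variables are created with catalog stored procedures. A hedged sketch, assuming hypothetical folder, environment, and variable names:

-- Hypothetical names throughout.
EXEC SSISDB.catalog.create_environment
    @folder_name      = N'ETL',
    @environment_name = N'Prod';

EXEC SSISDB.catalog.create_environment_variable
    @folder_name      = N'ETL',
    @environment_name = N'Prod',
    @variable_name    = N'FtpServer',
    @data_type        = N'String',
    @sensitive        = 0,
    @value            = N'ftp.example.com',
    @description      = N'FTP host for nightly loads';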


Q13. You are developing a SQL Server Integration Services (SSIS) package that imports data into a data warehouse.

You add an Execute SQL task to the control flow. The task must execute a simple INSERT statement.

The task has the following requirements:

•The INSERT statement must use the value of a string package variable. The variable name is StringVar.

•The Execute SQL task must use an OLE DB Connection Manager.

In the Parameter Mapping tab of the Execute SQL task, StringVar has been added as the only parameter.

You must configure the SQLStatement property of the Execute SQL task. Which SQL statement should you use?

A. INSERT INTO dbo.Table (variablevalue) VALUES (@StringVar)

B. INSERT INTO dbo.Table (variablevalue) VALUES ($Project::StringVar)

C. INSERT INTO dbo.Table (variablevalue) VALUES (?)

D. INSERT INTO dbo.Table (variablevalue) VALUES ($Package::StringVar)

Answer: C
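
With an OLE DB connection manager, the Execute SQL task marks parameters with the ? placeholder and binds them by ordinal position, so the Parameter Name in the Parameter Mapping tab must be 0 for the first parameter. A sketch using the names from the question:

-- SQLStatement property of the Execute SQL task:
INSERT INTO dbo.Table (variablevalue) VALUES (?)
-- Parameter Mapping tab: Variable Name = User::StringVar,
-- Direction = Input, Data Type = VARCHAR, Parameter Name = 0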


Q14. You administer a SQL Server Integration Services (SSIS) solution in the SSIS catalog. A SQL Server Agent job is used to execute a package daily with the basic logging level.

Recently, the package execution failed because of a primary key violation when the package inserted data into the destination table.

You need to identify all previous times that the package execution failed because of a primary key violation.

What should you do?

A. Use an event handler for OnError for the package.

B. Use an event handler for OnError for each data flow task.

C. Use an event handler for OnTaskFailed for the package.

D. View the job history for the SQL Server Agent job.

E. View the All Messages subsection of the All Executions report for the package.

F. Store the System::SourceID variable in the custom log table.

G. Store the System::ServerExecutionID variable in the custom log table.

H. Store the System::ExecutionInstanceGUID variable in the custom log table.

I. Enable the SSIS log provider for SQL Server for OnError in the package control flow.

J. Enable the SSIS log provider for SQL Server for OnTaskFailed in the package control flow.

K. Deploy the project by using dtutil.exe with the /COPY DTS option.

L. Deploy the project by using dtutil.exe with the /COPY SQL option.

M. Deploy the .ispac file by using the Integration Services Deployment Wizard.

N. Create a SQL Server Agent job to execute the SSISDB.catalog.validate_project stored procedure.

O. Create a SQL Server Agent job to execute the SSISDB.catalog.validate_package stored procedure.

P. Create a SQL Server Agent job to execute the SSISDB.catalog.create_execution and SSISDB.catalog.start_execution stored procedures.

Q. Create a table to store error information. Create an error output on each data flow destination that writes OnError event text to the table.

R. Create a table to store error information. Create an error output on each data flow destination that writes OnTaskFailed event text to the table.

Answer: E
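
The All Executions report is built on the SSISDB catalog views, so the same history can be pulled with T-SQL. A hedged sketch of finding past primary key failures, assuming the error text contains the standard violation wording:

SELECT em.operation_id, em.message_time, em.package_name, em.message
FROM SSISDB.catalog.event_messages AS em
WHERE em.event_name = 'OnError'
  AND em.message LIKE '%PRIMARY KEY%'
ORDER BY em.message_time DESC;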


Q15. To ease the debugging of packages, you standardize the SQL Server Integration Services (SSIS) package logging methodology.

The methodology has the following requirements:

•Centralized logging in SQL Server

•Simple deployment

•Availability of log information through reports or T-SQL

•Automatic purge of older log entries

•Configurable log details

You need to configure a logging methodology that meets the requirements while minimizing the amount of deployment and development effort.

What should you do?

A. Deploy the package by using an msi file.

B. Use the gacutil command.

C. Create an OnError event handler.

D. Create a reusable custom logging component.

E. Use the dtutil /copy command.

F. Use the Project Deployment Wizard.

G. Run the package by using the dtexec /rep /conn command.

H. Add a data tap on the output of a component in the package data flow.

I. Run the package by using the dtexec /dumperror /conn command.

J. Run the package by using the dtexecui.exe utility and the SQL Log provider.

K. Deploy the package to the Integration Services catalog by using dtutil and use SQL Server to store the configuration.

Answer:

Explanation: References:

http://msdn.microsoft.com/en-us/library/ms140246.aspx
http://msdn.microsoft.com/en-us/library/ms180378(v=sql.110).aspx
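
For reference, the SSIS log provider for SQL Server writes its entries to the dbo.sysssislog table in whichever database the logging connection points at, which is what makes the log information available through T-SQL. A minimal query sketch:

SELECT event, source, starttime, message
FROM dbo.sysssislog
WHERE event = 'OnError'
ORDER BY starttime DESC;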


Q16. You are designing a partitioning strategy for a large fact table in a data warehouse. Tens of millions of new records are loaded into the data warehouse weekly, outside of business hours.

Most queries are generated by reports and by cube processing. Data is frequently queried at the day level and occasionally at the month level.

You need to partition the table to maximize the performance of queries. What should you do? (More than one answer choice may achieve the goal. Select the BEST answer.)

A. Partition the fact table by month, and compress each partition.

B. Partition the fact table by week.

C. Partition the fact table by year.

D. Partition the fact table by day, and compress each partition.

Answer: D
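
A hedged T-SQL sketch of the daily-partitioning approach, with hypothetical object names and only a few boundary values shown; in practice the boundaries would cover the full date range, with compression applied per partition:

CREATE PARTITION FUNCTION pfSalesByDay (date)
AS RANGE RIGHT FOR VALUES ('2016-01-01', '2016-01-02', '2016-01-03');

CREATE PARTITION SCHEME psSalesByDay
AS PARTITION pfSalesByDay ALL TO ([PRIMARY]);

CREATE TABLE dbo.FactSales (
    SaleDate date NOT NULL,
    Amount   money NOT NULL
) ON psSalesByDay (SaleDate);

-- Compress the partitions (page compression shown here).
ALTER TABLE dbo.FactSales
REBUILD PARTITION = ALL WITH (DATA_COMPRESSION = PAGE);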


Q17. You develop a SQL Server Integration Services (SSIS) package that imports SQL Azure data into a data warehouse every night.

The SQL Azure data contains many misspellings and variations of abbreviations. To import the data, a developer used the Fuzzy Lookup transformation to choose the closest-matching string from a reference table of allowed values. The number of rows in the reference table is very large.

If no acceptable match is found, the Fuzzy Lookup transformation passes a null value.

The current setting for the Fuzzy Lookup similarity threshold is 0.50. Many values are incorrectly matched.

You need to ensure that more accurate matches are made by the Fuzzy Lookup transformation without degrading performance.

What should you do?

A. Change the Exhaustive property to True.

B. Change the similarity threshold to 0.55.

C. Change the similarity threshold to 0.40.

D. Increase the maximum number of matches per lookup.

Answer: B

Explanation:

Raising the similarity threshold from 0.50 to 0.55 makes matching stricter, so fewer incorrect matches pass. Setting Exhaustive to True would also improve accuracy, but it forces a row-by-row comparison against the large reference table and degrades performance.

Reference: http://msdn.microsoft.com/en-us/library/ms137786.aspx


Q18. You are developing a SQL Server Integration Services (SSIS) project to read and write data from a Windows Azure SQL Database database to a server that runs SQL Server 2016.

The connection will be used by data flow tasks in multiple SSIS packages. The address of the target Windows Azure SQL Database database will be provided by a project parameter.

You need to create a solution to meet the requirements by using the least amount of administrative effort and maximizing data flow performance.

What should you do?

A. Use an SSIS Script task that uses the custom assembly to parse the text data when inserting it.

B. Use an SSIS Script transformation that uses the custom assembly to parse the text data when inserting it.

C. Create a SQL Common Language Runtime (SQLCLR) function that uses the custom assembly to parse the text data, deploy it in the Windows Azure SQL Database database, and use it when inserting data.

D. Create a SQL Common Language Runtime (SQLCLR) stored procedure that uses the custom assembly to parse the text data, deploy it in the Windows Azure SQL Database database, and use it when inserting data.

Answer: A


Q19. You are designing a data warehouse for a software distribution business that stores sales by software title. It stores sales targets by software category. Software titles are classified into subcategories and categories. Each software title is included in only a single software subcategory, and each subcategory is included in only a single category. The data warehouse will be a data source for an Analysis Services cube.

The data warehouse contains two fact tables:

•factSales, used to record daily sales by software title

•factTarget, used to record the monthly sales targets by software category

Reports developed against the warehouse must show sales by software title, subcategory, and category, as well as sales targets.

You need to design the software title dimension. The solution should use as few tables as possible while supporting all the requirements.

What should you do?

A. Create three software tables, dimSoftware, dimSoftwareCategory, and dimSoftwareSubcategory and a fourth bridge table that joins software titles to their appropriate category and subcategory table records with foreign key constraints. Direct the cube developer to use key granularity attributes.

B. Create three software tables, dimSoftware, dimSoftwareCategory, and dimSoftwareSubcategory. Connect factSales to all three tables and connect factTarget to dimSoftwareCategory with foreign key constraints. Direct the cube developer to use key granularity attributes.

C. Create one table, dimSoftware, which contains Software Detail, Category, and Subcategory columns. Connect factSales to dimSoftware with a foreign key constraint. Direct the cube developer to use a non-key granularity attribute for factTarget.

D. Create two tables, dimSoftware and dimSoftwareCategory. Connect factSales to dimSoftware and factTarget to dimSoftwareCategory with foreign key constraints. Direct the cube developer to use key granularity attributes.

Answer: C
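
A hedged sketch of the denormalized (star-schema) dimension that answer C describes, with hypothetical column names; factTarget joins to the Category attribute in the cube at a non-key granularity:

CREATE TABLE dbo.dimSoftware (
    SoftwareKey   int IDENTITY(1,1) PRIMARY KEY,
    SoftwareTitle nvarchar(100) NOT NULL,
    Subcategory   nvarchar(50)  NOT NULL,
    Category      nvarchar(50)  NOT NULL
);

CREATE TABLE dbo.factSales (
    SoftwareKey int NOT NULL
        REFERENCES dbo.dimSoftware (SoftwareKey),
    SaleDateKey int NOT NULL,
    SalesAmount money NOT NULL
);
-- factTarget stores monthly targets per category; the cube maps it to
-- dimSoftware through the non-key Category granularity attribute.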


Q20. You are designing a data warehouse with two fact tables. The first table contains sales per month and the second table contains orders per day.

Referential integrity must be enforced declaratively.

You need to design a solution that can join a single time dimension to both fact tables.

What should you do?

A. Join the two fact tables.

B. Merge the fact tables.

C. Create a time dimension that can join to both fact tables at their respective granularity.

D. Create a surrogate key for the time dimension.

Answer: C
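
A hedged sketch of one common way to satisfy answer C, with hypothetical names: the dimension is keyed at the day grain, and the monthly fact keeps a declarative foreign key by referencing the DateKey of the first day of its month:

CREATE TABLE dbo.dimDate (
    DateKey      int  PRIMARY KEY,   -- e.g. 20160101
    CalendarDate date NOT NULL,
    MonthNumber  int  NOT NULL,
    CalendarYear int  NOT NULL
);

CREATE TABLE dbo.factOrders (      -- daily grain
    DateKey     int NOT NULL REFERENCES dbo.dimDate (DateKey),
    OrderAmount money NOT NULL
);

CREATE TABLE dbo.factSales (       -- monthly grain
    MonthStartDateKey int NOT NULL REFERENCES dbo.dimDate (DateKey),
    SalesAmount       money NOT NULL
);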

Explanation: References:

http://msdn.microsoft.com/en-us/library/ms174537.aspx
http://technet.microsoft.com/en-us/library/ms174832.aspx
http://msdn.microsoft.com/en-us/library/ms174884.aspx
http://decipherinfosys.wordpress.com/2007/02/01/surrogate-keys-vs-natural-keys-for-primary-key/
http://www.agiledata.org/essays/keys.html
http://www.databasejournal.com/features/mssql/article.php/3922066/SQL-Server-Natural-Key-Verses-Surrogate-Key.htm
http://www.jamesserra.com/archive/2016/01/surrogate-keys/