I would recommend that you don't do that.
1. DELETE rows where the [tbl_Data] PKey can no longer be found in [tbl_Data_Load].
2. UPDATE any rows where the PKey already exists, but only if at least one column has actually changed. An UPDATE that changes nothing still fires any triggers, which wastes CPU at best and at worst causes associated tasks to run (e.g. sending a "Record has changed" email when there is actually no change!). Use a BINARY collation for character (CHAR/VARCHAR) column comparisons so that case-only differences (like 'a' changing to 'A') are treated as significant.
   If the source data has an Update Date/Time column, or some other reliable indicator of change (such as a Row Version or an incremented EditCount), use that to reduce the number of rows that are considered for updating (see the commented filter in the sketch below).
3. INSERT any rows where the PKey does not already exist.
Doing the three steps in that order should reduce the effort for the UPDATE somewhat; in particular, inserting last means brand-new rows never get pointlessly compared by the UPDATE.
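Something like this, as a minimal sketch. I've assumed a single-column key [ID], two data columns [Col1] (a string) and [Col2], and an optional [UpdateDT] change indicator; none of those names come from your actual schema, so adjust to suit:

```sql
-- 1) DELETE rows no longer present in the load table
DELETE D
FROM   tbl_Data AS D
WHERE  NOT EXISTS (SELECT * FROM tbl_Data_Load AS L WHERE L.ID = D.ID);

-- 2) UPDATE rows that exist in both tables, but only if something changed.
--    The COLLATE forces a binary comparison, so 'a' -> 'A' counts as a change.
UPDATE D
SET    Col1 = L.Col1,
       Col2 = L.Col2
FROM   tbl_Data AS D
       JOIN tbl_Data_Load AS L ON L.ID = D.ID
WHERE  (   D.Col1 <> L.Col1 COLLATE Latin1_General_BIN2
        OR D.Col2 <> L.Col2)
--  AND L.UpdateDT > D.UpdateDT  -- uncomment if the source has a
--                               -- reliable change indicator
;

-- 3) INSERT rows whose key is not present yet
INSERT INTO tbl_Data (ID, Col1, Col2)
SELECT L.ID, L.Col1, L.Col2
FROM   tbl_Data_Load AS L
WHERE  NOT EXISTS (SELECT * FROM tbl_Data AS D WHERE D.ID = L.ID);
```

Note that NULLable columns need extra care in the "has it changed?" test, since NULL <> NULL does not evaluate to true; ISNULL wrappers with a sentinel value are the usual workaround.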
You can also use the MERGE command instead of three separate steps, but personally I find the syntax of MERGE unwieldy and harder to get right first time. It's probably just me though ...
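For comparison, this is roughly the same operation as a single MERGE, using the same assumed column names as the sketch above:

```sql
MERGE tbl_Data AS D
USING tbl_Data_Load AS L
   ON L.ID = D.ID
WHEN MATCHED AND (   D.Col1 <> L.Col1 COLLATE Latin1_General_BIN2
                  OR D.Col2 <> L.Col2)
    THEN UPDATE SET Col1 = L.Col1,
                    Col2 = L.Col2
WHEN NOT MATCHED BY TARGET
    THEN INSERT (ID, Col1, Col2) VALUES (L.ID, L.Col1, L.Col2)
WHEN NOT MATCHED BY SOURCE
    THEN DELETE;
```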
Limiting the UPDATE (e.g. by NOT pre-deleting all rows, and by not updating rows that have not changed) will make a significant (probably huge!) difference to the volume of data recorded in the Transaction Log. That will save both disk and CPU time.