
This product reached the end of support date on March 31, 2021.

Bulk insert common issues

przemek_tafelsk
Inactive

Background



The bulk insert method provides a more efficient way of storing data in the MS SQL database when CAS is processing AMD files (zdata, etc.).



No special configuration is needed to enable this mode when CAS and the MS SQL Server that hosts the CAS database reside on the same machine.



However, in more than 50% of DC-RUM installations customers use a remote MS SQL Server to host the CAS database - either a dedicated SQL Server or a shared SQL Server (cluster). With a remote SQL Server, bulk insert mode requires manual configuration.




Bulk insert mode requirements



1) a shared folder on MS SQL Server accessible for read+write by CAS (system service)



2) CAS (system service) must run with domain user privileges (instead of the default "local account" on Windows) so that CAS can access the shared folder (1) on the MS SQL Server




The how-to procedure is described in DC-RUM Admin guide -> Chapter 19 : Advanced Configuration -> Configuring Bulk Insert
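To give an idea of what bulk insert mode does under the hood: CAS writes flat data files into the shared folder via the UNC ("write") path, and SQL Server then loads each file from its local ("read") path with a BULK INSERT statement. The T-SQL below is only an illustrative sketch - the table name, file name and terminators are made-up examples, not the exact statements CAS generates:

    -- Illustrative example only; table and file names are hypothetical
    BULK INSERT dbo.ExampleCasTable
    FROM 'D:\bulkinsert\ExampleCasTable_0.txt'   -- local "read" path on the SQL Server
    WITH (
        FIELDTERMINATOR = '\t',   -- assumed field delimiter, for illustration
        ROWTERMINATOR   = '\n',
        TABLOCK                   -- table-level locking, as reported in server.log
    );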




When bulk insert mode is configured properly and CAS is restarted, server.log will contain the following:



T HLOG    13-06-19 14:36:52.114    Bulk insert test passed: all of 805 records recovered



Known issues

ID: 1

Issue: Despite proper configuration and the initial bulk insert test passing, bulk insert errors are reported in server.log during data processing.

Possible resolution: Check whether antivirus software is running on the SQL Server. If so, add the bulk insert folder (the shared folder on the SQL Server used by CAS) to the antivirus EXCLUSIONS list, so that the antivirus does not scan every new file created there by CAS and does not lock those files, allowing CAS to delete the files once SQL Server is done importing data from them.

ID: 2

Issue: Bulk insert mode is not enabled on a remote SQL Server in DC-RUM release 12.1 GA despite proper configuration.

Possible resolution: Due to a bug in the bulk insert mode tester that runs at CAS start-up, bulk insert mode is not enabled despite the correct settings on the CAS and SQL Server end. Even if bulk insert mode worked just fine before migration, it stops working right after CAS v12.1 starts. Data processing continues in the regular (prepared inserts) mode, which is much slower than bulk insert.

There is a patch for this issue available for CAS 12.1 GA. Unzip it first, then load it via the Patch Assistant on CAS and restart CAS from the Patch Assistant page.


"Known issues" ID 2: the patch link for CAS 12.1 GA under "Possible resolution" is not working.



Could you please update the hyperlink provided? We're planning on upgrading a client and would like to be prepared, in case we experience this issue - thanks!

André - send me your email address and I'll send it to you.

U know my email (smile)

Try it now.

Ulf, mailed you (smile)



Thanks Przemek, link is working now!

jose_baeza
Participant

Good afternoon,

The bulk insert method is already implemented on CAS 12.1 with a SQL Server (Failover Cluster) containing 4 nodes.
I configured access to the shared folder on node 1 and the method works without problems.
When I perform the same steps on any of the 3 remaining nodes, I get the message:

"The network folder specified is currently mapped using a different user name and password. To connect using a different user name and password, first disconnect any existing mappings to this network share."

When I check the network locations or mapped drives in Explorer, there is no access to any shared drive. I tried assigning different letters to the shared drive and got the same message.

When the CAS DB moves to a node other than node 1, I get the message:
"data processing was stopped due to BULK INSERT failures, delay in data processing performance in twentieth minute(s), data processing is being performed in the debug mode"
When the DB returns to node 1, the CAS bulk insert works correctly.
Any suggestions?
Thanks

jose.baeza@hova-it.net: You should rather contact Microsoft support, since this sounds unrelated to Compuware software. Try the solution here: http://support.microsoft.com/kb/938120 - maybe it will help.

 

Good luck!

Senior Product Manager,
Dynatrace Managed expert

dennis_mispelba
Inactive

I have a question concerning the documentation which is referenced here as well (Admin Guide). This is affecting 12.2 and 12.3 documentation.

In the "Configuring Bulk insert" section it is stated in step 3 to:

c. Set the Value for RtmJob.bulk.write to \\SQL_server_IP_address\bulkinsert$\

d. Set the Value for RtmJob.bulk.read to D:\bulkinsert\

Now, I don't understand "d": why should the read value be set to D:\bulkinsert\ when the folder being written to is on the (remote) SQL server (since we are talking here about bulk insert configuration for a setup where MSSQL is not on the same machine as the CAS)? Shouldn't both properties be set to the shared folder, which physically resides on the MSSQL server? The earlier steps did not have me create any local bulkinsert folder (from the CAS perspective), so I am struggling with point "d" - or is it a mistake in the documentation? Maybe someone could shed some light.

Best regards,
Dennis

Dennis,

Docs are OK in this matter, otherwise we would have thousands of slow CASes all over the world (smile)

The bulkinsert shared directory is located on the DB server. Both properties point to the same folder but from two different perspectives: RtmJob.bulk.write is what CAS sees and RtmJob.bulk.read is what the DB server sees. If the DB is local, such a configuration is valid too, although sharing is not necessary and both paths could be local.

 

gregg_kniss
Guide

I believe the CAS is going to write remotely and the SQL Server is going to read locally. This can also be done in the opposite configuration, where the SQL account reads remotely from the CAS.

dennis_mispelba
Inactive

Ok, thanks for the insight.

Now, I have a customer, who actually did it the opposite way as indicated by Gregg:

  1. RtmJob.bulk.write = D:\Program Files\Compuware\CAS_Bulk\
  2. RtmJob.bulk.read = \\CAS_IP_ADDRESS\CAS_Bulk\

They actually set it up with prior consultancy from a Dynatrace consultant. Now they have moved the DB (and we changed the configuration setting in repository.properties), but bulk insert is not working anymore, so I was trying to get my head around this configuration. Actually, given your explanation, you would not need to change anything with this configuration, as the shared folder is on the CAS instead of the MSSQL server (although this is not the approach we document)? :S

Maybe their DB user is missing the right to read from the shared folder located on the CAS... This is what they are currently checking.

Hi Dennis, that may likely be the case. At least on SQL Server 2012 you need to set additional permissions on the shared folder on the database server.
Give NT SERVICE\MSSQLSERVER permissions on the folder (or NT SERVICE\MSSQL$InstanceName for a named instance).

Thanks, Frans.

In the meantime we were able to solve the issue (there was also some misunderstanding involved).
The DBA was asking me if he can now safely activate the functions of MSSQL 2012 (so that the new features are really used; I think at the moment it is running in some kind of compatibility mode to behave like an older MSSQL). I guess I can tell him to go ahead, can't I? Do you know?

I can't really tell, Dennis. I do not know what "the functions of MSSQL 2012" are. (smile)
I know it runs fine on MSSQL 2012, and CAS is less picky than BSM. My hunch is that it should work fine, but that is far from scientific evidence.

For sure I would first test this on a test server, or have Support say something about this.

 

Support gave their OK (smile). The compatibility level is described at https://msdn.microsoft.com/en-us/library/bb510680.aspx, so this should be kept in mind when updating/migrating/moving to a new MSSQL DB version. The related ticket can be found here: SUPDCRUM-6949 (I cannot link the JIRA as there will be an unexpected error for the macro).
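For anyone checking the same thing, a minimal T-SQL sketch of inspecting and raising the compatibility level described in the MSDN article above (the database name is just a placeholder, not the actual CAS database name):

    -- Check the current compatibility level (replace 'CASDB' with your database name)
    SELECT name, compatibility_level FROM sys.databases WHERE name = 'CASDB';

    -- Raise it to the SQL Server 2012 level (110) once you are sure the application supports it
    ALTER DATABASE CASDB SET COMPATIBILITY_LEVEL = 110;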

Thank you for the reference and the useful information. This community works (wink)
The JIRA# is not 'public', but it is stored as a reference, though.

gregg_kniss
Guide

It doesn't seem like anything would need to change.

 

Definitely check both folder and share permissions for bulkinsert and ensure that the current account running SQL has access.

Well, working on a new CAS 12.3 installation, and a remote DB server - also under our control - the Bulk Insert did not work right away.

Environment: Microsoft Windows Server 2012 and Microsoft SQL Server 2012.

Set up "by the book", it nevertheless failed according to the log because the file "D:\BulkInsert$\BulkInsertTest_0.txt" could not be opened: Operating system error code 5 (Access is denied.). This message indicates that on the DB server side the file could not be read due to restrictions.

Did some investigation into SQL 2012 and file and directory permissions. As it turns out, on the database server you also have to give the (virtual) SQLSERVER service account access to the BulkInsert folder!

Open the security tab of the folder through Properties and give NT SERVICE\MSSQLSERVER permissions on the folder. (Or NT SERVICE\MSSQL$InstanceName for a named instance)
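To find out which service account SQL Server actually runs under (and therefore which account needs permissions on the folder), a generic T-SQL query can be used - this is standard SQL Server, not specific to DC-RUM:

    -- Lists the SQL Server services on this instance and the accounts they run as,
    -- e.g. NT SERVICE\MSSQLSERVER or NT SERVICE\MSSQL$InstanceName
    SELECT servicename, service_account
    FROM sys.dm_server_services;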

References: https://msdn.microsoft.com/en-us/library/jj219062.aspx
http://dba.stackexchange.com/questions/30668/how-do-i-configure-sql-server-2012-so-that-it-can-resto...

Failing Log Part:

DB server assumed to be remote

Bulk Inserter folders, for write: \\10.11.12.13\BulkInsert$\, for read: D:\BulkInsert$\. Table-level locking: true

Adaptive Inserter:  refresh interval set to 20 minutes

BulkInsertTest Bulk inserter failed on table BulkInsertTest (D:\BulkInsert$\BulkInsertTest_0.txt). Reason :Cannot bulk load because the file "D:\BulkInsert$\BulkInsertTest_0.txt" could not be opened. Operating system error code 5(Access is denied.).

Bulk insert test for data source default failed: stored 0 records instead of 805. All inserts will be made by PreparedInserter

Adaptive Inserter:  Implementation for data source 'default' initially chosen to Prepared Inserter

In order to configure Bulk Inserter you must:

1) set 'bulk.enabled' property in UserProperties table

2) set 'bulk.write' and 'bulk.read' properites in UserProperties table

   or 'RtmJob.bulk.write' , 'RtmJob.bulk.read' in RtmProps table

3) specify user which has access to write directory on remote server (log on tab in services)

Current settings are:

bulk enabled :true

write directory :\\10.11.12.13\BulkInsert$\

read directory :D:\BulkInsert$\

user :srvcCASReportSerer

 

Successful Log Part:

DB server assumed to be remote

Bulk Inserter folders, for write: \\10.11.12.13\BulkInsert$\, for read: D:\BulkInsert$\. Table-level locking: true

Adaptive Inserter:  refresh interval set to 20 minutes

Bulk insert test passed: all of 829 records recovered

Adaptive Inserter:  Implementation for data source 'default' initially chosen to Bulk Inserter

 

 

Also, on MS Server 2012 you create the folder BulkInsert$ (e.g. D:\BulkInsert$) and share that folder.
It may not be an option to share a D:\BulkInsert folder as BulkInsert$, in contradiction to what the instructions say.
Configuring Bulk Insert