
Sunday, February 12, 2012

Alternative to OSQL?

SQL2K SP4
Howdy all. I have a daily process that gets a .txt file full of Inserts/
Updates/ Deletes and imports them into a SQL DB every day. I have a Stored
Proc that calls a .bat which contains OSQL and runs the .txt file. Are there
any alternatives to this from within SQL Server?
TIA, ChrisR
|||Hi Chris
You could write your own application to parse the file and run each command;
a different alternative would be to load the statements into a table and use the
EXEC command to run them. As your statements are in a file, there is a
security risk. If you just provided a data file and loaded it into a staging
table you could then insert/update existing data quite easily en masse without
having to insert/update each row individually. You could use DTS, BULK INSERT
or BCP to quickly load the data into a staging table.
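For example, a rough sketch of that approach (the table name, file path and terminators below are made up and would need to match your actual feed):

-- Hypothetical staging table; its columns must mirror the layout of the data file
CREATE TABLE dbo.DailyFeed_Staging
(
    CustomerID   int          NOT NULL,
    CustomerName varchar(100) NULL,
    Balance      money        NULL
)

-- Load the whole file in one set-based operation (SQL 2000 syntax)
BULK INSERT dbo.DailyFeed_Staging
FROM 'C:\Feeds\daily_extract.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK)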
John
"ChrisR" wrote:

> SQL2K SP4
> Howdy all. I have a daily process that gets a .txt file full of Inserts/
> Updates/ Deletes and imports them into a SQL DB every day. I have a Stored
> Proc that calls a .bat which contains OSQL and runs the .txt file. Are there
> any alternatives to this from within SQL Server?
> TIA, ChrisR
>
>|||I appreciate your ideas, but I'm confused.

> You could write your own application to parse the file and run each command
I really need to do this from within SQL Server.

> If you just provided a data file and loaded it into a staging
> table you could then insert/update existing data quite easily en masse without
> having to insert/update each row individually. You could use DTS, BULK INSERT
> or BCP to quickly load the data into a staging table.
How would this be any better than what I have? I still need to get it from
the file into a table. The way I'm reading this, I need to go from file to
table, then from table to table?
"John Bell" <jbellnewsposts@.hotmail.com> wrote in message
news:C37FBE46-03DF-4FF8-9F53-35532DECCE64@.microsoft.com...
> Hi Chris
> You could write your own application to parse the file and run each command;
> a different alternative would be to load the statements into a table and use the
> EXEC command to run them. As your statements are in a file, there is a
> security risk. If you just provided a data file and loaded it into a staging
> table you could then insert/update existing data quite easily en masse without
> having to insert/update each row individually. You could use DTS, BULK INSERT
> or BCP to quickly load the data into a staging table.
> John
> "ChrisR" wrote:
>
|||Hi Chris
How do you create the file that makes up these SQL Statements?
John
"ChrisR" wrote:

> I appreciate your ideas, but I'm confused.
>
> I really need to do this from within SQL Server.
>
> How would this be any better than what I have? I still need to get it from
> the file into a table. The way I'm reading this, I need to go from file to
> table, then from table to table?
>
> "John Bell" <jbellnewsposts@.hotmail.com> wrote in message
> news:C37FBE46-03DF-4FF8-9F53-35532DECCE64@.microsoft.com...
>
>|||It is made up for me. A mainframe does some stuff, an ETL tool called Tree
House does some stuff, etc.
"John Bell" <jbellnewsposts@.hotmail.com> wrote in message
news:FFCD1719-BA31-49DE-BD31-BC52EF3AD898@.microsoft.com...
> Hi Chris
> How do you create the file that makes up these SQL Statements?
> John
> "ChrisR" wrote:
>
|||Hi
I guess you could get it changed to produce updategrams and use SQLXML, but
it would be far easier and quicker just to dump a data file and load it
en masse.
If you want to stick with the SQL statements then you could look at loading
them into a table and then using a cursor and the EXECUTE statement to run them
(see Books Online for both); this relies on each statement being less
than 8000 characters and containing no carriage returns or line feeds.
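Something along these lines would do it (only a sketch; the table and column names are assumptions, and each row is expected to hold one complete statement):

-- Assumed holding table, loaded from the .txt file with BULK INSERT/DTS/BCP:
--   CREATE TABLE dbo.PendingCommands (CommandId int IDENTITY(1,1), CommandText varchar(8000))
DECLARE @sql varchar(8000)

DECLARE cmd_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT CommandText FROM dbo.PendingCommands ORDER BY CommandId

OPEN cmd_cursor
FETCH NEXT FROM cmd_cursor INTO @sql

WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC (@sql)   -- run each INSERT/UPDATE/DELETE in turn
    FETCH NEXT FROM cmd_cursor INTO @sql
END

CLOSE cmd_cursor
DEALLOCATE cmd_cursor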
John
"ChrisR" wrote:

> It is made up for me. A mainframe does some stuff, an ETL tool called Tree
> House does some stuff, etc.
>
> "John Bell" <jbellnewsposts@.hotmail.com> wrote in message
> news:FFCD1719-BA31-49DE-BD31-BC52EF3AD898@.microsoft.com...
>
>|||"ChrisR" <ChrisR@.noEmail.com> wrote in message
news:%230AAwWdoGHA.4776@.TK2MSFTNGP03.phx.gbl...
> I appreciate your ideas, but I'm confused.
>
> I really need to do this from within SQL Server.
Why?
In any case, your best bet is probably BULK INSERT.

>
> How would this be any better than what I have? I still need to get it from
> the file into a table. The way I'm reading this, I need to go from file to
> table, then from table to table?
Yes and no. You can go from file->table.
What John is suggesting is a staging table (presumably w/o indexes) because
this will make the actual load from the file faster.
It's not necessary, but it can sometimes improve performance and overall
maintenance.
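For instance, once the file is in a staging table the table-to-table step can be
done in two set-based statements rather than row by row (the table and key names
below are only placeholders):

-- Update rows that already exist in the target (SQL 2000 UPDATE ... FROM syntax)
UPDATE t
SET    t.CustomerName = s.CustomerName,
       t.Balance      = s.Balance
FROM   dbo.Customer AS t
JOIN   dbo.DailyFeed_Staging AS s ON s.CustomerID = t.CustomerID

-- Insert rows that are new
INSERT INTO dbo.Customer (CustomerID, CustomerName, Balance)
SELECT s.CustomerID, s.CustomerName, s.Balance
FROM   dbo.DailyFeed_Staging AS s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.Customer WHERE CustomerID = s.CustomerID)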

>
> "John Bell" <jbellnewsposts@.hotmail.com> wrote in message
> news:C37FBE46-03DF-4FF8-9F53-35532DECCE64@.microsoft.com...
>