Showing posts with label load. Show all posts

Thursday, March 29, 2012

ETL and SSIS/SSAS roles in it all.

I am a student in a data warehousing class. This is my only experience using
SQL Server 2005 BI ETL tools. I need to extract, transform, and load into a
data warehouse using SQL Server 2005. I've extracted all tables from my 3nf
d.b. into flat files using SSIS. My question is about the connection
between SSIS and SSAS, and how I create the time dimension (in SSAS?) and
then have it connect back to my process in SSIS so I can do the final load
into it.
I thought I could use SSIS for the entire ETL process. Now I'm discovering
I somehow need to bring in SSAS to define the dimensions and somehow finish
up in SSIS again.
I defined my source d.b. and final destination data warehouse in regular SQL
Server tables. Do I need to delete the final destination data warehouse
database and redefine it as dimensions in SSAS?
Any help on this is appreciated. If you even have a recommended tutorial
that would help as well.
Hello Laura,
there are a number of ways to skin this cat. The way I usually do it is:
Build a relational DB (star schema).
Build an SSAS cube on top of that schema.
Use SSIS to load the relational DB.
In 2005 you can also load the SSAS partitions and dimensions directly
through SSIS, which is way cool.

Allan Mitchell
http://wiki.sqlis.com | http://www.sqlis.com | http://www.sqldts.com |
http://www.konesans.com

> I am a student in a data warehousing class. This is my only experience
> using SQL Server 2005 BI ETL tools. I need to extract, transform, and
> load into a data warehouse using SQL Server 2005. I've extracted all
> tables from my 3nf d.b. into flat files using SSIS. My question is
> about the connection between SSIS and SSAS, and how I create the time
> dimension (in SSAS?) and then have it connect back to my process in
> SSIS so I can do the final load into it.
> I thought I could use SSIS for the entire ETL process. Now I'm
> discovering I somehow need to bring in SSAS to define the dimensions
> and somehow finish up in SSIS again.
> I defined my source d.b. and final destination data warehouse in
> regular SQL Server tables. Do I need to delete the final destination
> data warehouse database and redefine it as dimensions in SSAS?
> Any help on this is appreciated. If you even have a recommended
> tutorial that would help as well.
>
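For the time dimension asked about above, one common approach is to build it as an ordinary relational table in the star schema, populate it once with T-SQL, and then let SSAS define a dimension over that table. A minimal sketch follows; the table and column names are illustrative, not from the thread:

```sql
-- Illustrative date-dimension table for a star schema (SQL Server 2005).
CREATE TABLE dbo.DimDate (
    DateKey     INT      NOT NULL PRIMARY KEY,  -- e.g. 20120329
    FullDate    DATETIME NOT NULL,
    [Year]      SMALLINT NOT NULL,
    [Quarter]   TINYINT  NOT NULL,
    [Month]     TINYINT  NOT NULL,
    [Day]       TINYINT  NOT NULL
);

-- Populate one row per day for whatever range the warehouse covers.
DECLARE @d DATETIME;
SET @d = '2000-01-01';
WHILE @d <= '2012-12-31'
BEGIN
    INSERT INTO dbo.DimDate (DateKey, FullDate, [Year], [Quarter], [Month], [Day])
    VALUES (CONVERT(INT, CONVERT(CHAR(8), @d, 112)),  -- style 112 = yyyymmdd
            @d, YEAR(@d), DATEPART(QUARTER, @d), MONTH(@d), DAY(@d));
    SET @d = DATEADD(DAY, 1, @d);
END;
```

The fact tables then carry a DateKey foreign key, SSIS loads both the dimension and fact tables, and the SSAS dimension wizard is simply pointed at the finished table. Nothing in the destination database needs to be deleted or "redefined in SSAS".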

Monday, March 19, 2012

Errors loading a text file into a Sql Server Destination

I am trying to load 14+ million rows from a text file into a local SQL Server. I tried the SQL Server Destination because it seemed to be faster, but after about 5 million rows it would always fail. See the various errors below, which I received while trying different variations of FirstRow/LastRow, Timeout, Table lock, etc. After spending two days trying to get it to work, I switched to the OLE DB Destination and it worked fine. I would like to get the SQL Server Destination working because it seems much faster, but the error messages aren't much help. Any ideas on how to fix this?

Also, when I wanted to try loading just a small sample by specifying first row/last row, it would get to the upper limit, then pick up speed and appear to keep reading rows from the source file until it failed. I expected it to reach the limit I set and then stop processing.

[SS_DST tlkpDNBGlobal [41234]] Error: An OLE DB error has occurred. Error code: 0x80040E14. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "Reading from DTS buffer timed out.".

--
[SS_DST tlkpDNBGlobal [41234]] Error: The attempt to send a row to SQL Server failed with error code 0x80004005.
[DTS.Pipeline] Error: The ProcessInput method on component "SS_DST tlkpDNBGlobal" (41234) failed with error code 0xC02020C7. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
...
[FF_SRC DNBGlobal [6899]] Error: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
[DTS.Pipeline] Error: The PrimeOutput method on component "FF_SRC DNBGlobal" (6899) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

[DTS.Pipeline] Error: Thread "WorkThread1" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.


-
After first row/last row (from 1 to 1000000) limit is reached:
[SS_DST tlkpDNBGlobal [41234]] Error: An OLE DB error has occurred. Error code: 0x80040E14. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "Reading from DTS buffer timed out.".


When trying MaximumCommit = 1000000: it runs up to 1,000,000 OK, then slows down, and then errors.
[SS_DST tlkpDNBGlobal [41234]] Error: Unable to prepare the SSIS bulk insert for data insertion.

[DTS.Pipeline] Error: The PrimeOutput method on component "FF_SRC DNBGlobal" (6899) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

-
When attempting all in a single batch:
[OLE_DST tlkpDNBGlobal [57133]] Error: An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "The transaction log for database 'tempdb' is full. To find out why space in the log cannot be reused, see the log_reuse_wait_desc column in sys.databases". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Could not allocate space for object 'dbo.SORT temporary run storage: 156362715561984' in database 'tempdb' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.".

The commit size is most likely the problem. SQL Server will be storing those rows in memory until it is time to commit them. I suggest trying different commit sizes until you find one that is optimal.

You tried 1M? Try 500,000 rows; if that works, maybe try more, or if it fails, try less. You should find a good value fairly quickly.

If this does not help, let us know more.

Donald Farmer
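As noted later in the thread, in SSIS the setting in question is the MaxInsertCommitSize property of the SQL Server Destination. The same batching idea can be sketched in plain T-SQL with BULK INSERT's BATCHSIZE option; the file path, terminators, and batch size below are illustrative, not from the thread:

```sql
-- Illustrative only: the file path is hypothetical.
-- BATCHSIZE commits every N rows, so a failure rolls back at most one
-- batch instead of the whole 14M-row load, and the transaction log can
-- be truncated between batches (under the SIMPLE recovery model).
BULK INSERT dbo.tlkpDNBGlobal
FROM 'C:\data\dnb_global.txt'
WITH (
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR   = '\n',
    BATCHSIZE       = 500000,  -- commit every 500,000 rows
    TABLOCK                    -- take a table lock for the load
);
```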

|||Thanks Donald. I set the MaxInsertCommitSize to 10K and it worked fine.|||

Hi, Donald

I have had the same problem. How / where do I set the commit size?

Thanks

Agbeko

|||

It wasn't available until SP1. Once you have SP1 installed, you will get MaxInsertCommitSize as a property of the SQL Server Destination.

Thanks

Sutha

|||

Thanks for pointing that out Sutha. Remember also that commit size has always been available on the OLEDB Destination with the SQL Server provider.

Donald

|||

Hi, Sutha,

I am new to the SQL Server world. Dumb question:

Referring to SP1: is it for Windows, SQL Server 2005, or Integration Services?

Thanks

Chuan

|||

They are talking about SP1 for SQL Server 2005

Rafael Salas

|||Thanks Rafael|||

Rafael Salas wrote:

They are talking about SP1 for SQL Server 2005

Rafael Salas

Which includes Integration Services by the way.

John, the only dumb questions are the ones that don't get asked.

-Jamie
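To check which service-pack level an instance is actually running, a standard query works (no assumptions beyond a working connection):

```sql
-- ProductLevel returns 'RTM', 'SP1', 'SP2', ... for the database engine.
SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('ProductLevel')   AS ProductLevel,
       SERVERPROPERTY('Edition')        AS Edition;
```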

Friday, March 9, 2012

Errors during DTS package

We have been getting an error on a DTS package
that is trying to load about 800,000 records.
It used to work without problems.
Now.. It will fail anywhere between record 8000
and record 160,000.
It is not one set record. Here is the error message.
[DBNETLIB][CONNECTIONWRITE (SEND()).]
GENERAL NETWORK ERROR
It will get this error even when the server itself
runs the package, and the file is on the server's
disk drive.
Does anyone have any ideas?

Looks to me like the target SQL Server has reset the SQL connection due to
severe error of some sort - either sql engine error or windows error
perhaps. Have you checked the server logs on the target server - both SQL &
Windows logs to see if there's any further information available?
This could be such a wide variety of things - even disk failure or table /
index corruption within SQL Server. You might try running a dbcc checkdb on
the target server as well as part of your investigations.
Regards,
Greg Linwood
SQL Server MVP
"Scot" <sstyer@.alltel.net> wrote in message
news:46df01c42bce$62d54d60$a401280a@.phx.gbl...
> We have been getting an error on a DTS package
> that is trying to load about 800,000 records.
> It used to work without problems.
> Now.. It will fail anywhere between record 8000
> and record 160,000.
> It is not one set record. Here is the error message.
> [DBNETLIB][CONNECTIONWRITE (SEND()).]
> GENERAL NETWORK ERROR
> It will get this error even when the server itself
> runs the package, and the file is on the server's
> disk drive.
> Does anyone have any ideas?
>
|||We aren't seeing much in the event log. Nothing to
indicate a major problem. The server log is just
showing the process being killed. Is there any way we
can get a more meaningful error?

>--Original Message--
>Looks to me like the target SQL Server has re-set the sql
connection due to
>severe error of some sort - either sql engine error or
windows error
>perhaps. Have you checked the server logs on the target
server - both SQL &
>Windows logs to see if there's any further information
available?
>This could be such a wide variety of things - even disk
failure or table /
>index corruption within SQL Server. You might try running
a dbcc checkdb on
>the target server as well as part of your investigations.
>Regards,
>Greg Linwood
>SQL Server MVP
>"Scot" <sstyer@.alltel.net> wrote in message
>news:46df01c42bce$62d54d60$a401280a@.phx.gbl...
>
>.
>
|||Have you got the package logging configured under Properties / Logging /
Errorfile?
Regards,
Greg Linwood
SQL Server MVP
<anonymous@.discussions.microsoft.com> wrote in message
news:4d2d01c42c66$7df1ee30$a401280a@.phx.gbl...[vbcol=seagreen]
> We aren't seeing much in the event log. Nothing to
> indicate a major problem. The server log is just
> showing the process being killed. Is there any way We
> can get a more meaningful error.
>
>
> connection due to
> windows error
> server - both SQL &
> available?
> failure or table /
> a dbcc checkdb on

Errors during DTS package

We have been getting an error on a DTS package
that is trying to load about 800,000 records.
It used to work without problems.
Now.. It will fail anywhere between record 8000
and record 160,000.
It is not one set record. Here is the error message.
[DBNETLIB][CONNECTIONWRITE (SEND()).]
GENERAL NETWORK ERROR
It will get this error even when the server itself
runs the package, and the file is on the server's
disk drive.
Does anyone have any ideas?
Looks to me like the target SQL Server has re-set the sql connection due to
severe error of some sort - either sql engine error or windows error
perhaps. Have you checked the server logs on the target server - both SQL &
Windows logs to see if there's any further information available?
This could be such a wide variety of things - even disk failure or table /
index corruption within SQL Server. You might try running a dbcc checkdb on
the target server as well as part of your investigations.
Regards,
Greg Linwood
SQL Server MVP
"Scot" <sstyer@.alltel.net> wrote in message
news:46df01c42bce$62d54d60$a401280a@.phx.gbl...
> We have been getting an error on a DTS package
> that is trying to load about 800,000 records.
> It used to work without problems.
> Now.. It will fail anywhere between record 8000
> and record 160,000.
> It is not one set record. Here is the error message.
> [DBNETLIB][CONNECTIONWRITE (SEND()).]
> GENERAL NETWORK ERROR
> It will get this error even when the server itself
> runs the package, and the file is on the server's
> disk drive.
> Does anyone have any ideas?
>
|||We aren't seeing much in the event log. Nothing to
indicate a major problem. The server log is just
showing the process being killed. Is there any way We
can get a more meaningful error.

>--Original Message--
>Looks to me like the target SQL Server has re-set the sql
connection due to
>severe error of some sort - either sql engine error or
windows error
>perhaps. Have you checked the server logs on the target
server - both SQL &
>Windows logs to see if there's any further information
available?
>This could be such a wide variety of things - even disk
failure or table /
>index corruption within SQL Server. You might try running
a dbcc checkdb on
>the target server as well as part of your investigations.
>Regards,
>Greg Linwood
>SQL Server MVP
>"Scot" <sstyer@.alltel.net> wrote in message
>news:46df01c42bce$62d54d60$a401280a@.phx.gbl...
>
>.
>
|||Have you got the package logging configured under Properties / Logging /
Errorfile?
Regards,
Greg Linwood
SQL Server MVP
<anonymous@.discussions.microsoft.com> wrote in message
news:4d2d01c42c66$7df1ee30$a401280a@.phx.gbl...[vbcol=seagreen]
> We aren't seeing much in the event log. Nothing to
> indicate a major problem. The server log is just
> showing the process being killed. Is there any way We
> can get a more meaningful error.
>
> connection due to
> windows error
> server - both SQL &
> available?
> failure or table /
> a dbcc checkdb on

Errors during DTS package

We have been getting an error on a DTS package
that is trying to load about 800,000 records.
It used to work without problems.
Now.. It will fail anywhere between record 8000
and record 160,000.
It is not one set record. Here is the error message.
[DBNETLIB][CONNECTIONWRITE (SEND()).]
GENERAL NETWORK ERROR
It will get this error even when the server itself
runs the package, and the file is on the server's
disk drive.
Does anyone have any ideas?Looks to me like the target SQL Server has re-set the sql connection due to
severe error of some sort - either sql engine error or windows error
perhaps. Have you checked the server logs on the target server - both SQL &
Windows logs to see if there's any further information available?
This could be such a wide variety of things - even disk failure or table /
index corruption within SQL Server. You might try running a dbcc checkdb on
the target server as well as part of your investigations.
Regards,
Greg Linwood
SQL Server MVP
"Scot" <sstyer@.alltel.net> wrote in message
news:46df01c42bce$62d54d60$a401280a@.phx.gbl...
> We have been getting an error on a DTS package
> that is trying to load about 800,000 records.
> It used to work without problems.
> Now.. It will fail anywhere between record 8000
> and record 160,000.
> It is not one set record. Here is the error message.
> [DBNETLIB][CONNECTIONWRITE (SEND()).]
> GENERAL NETWORK ERROR
> It will get this error even when the server itself
> runs the package, and the file is on the server's
> disk drive.
> Does anyone have any ideas?
>|||We aren't seeing much in the event log. Nothing to
indicate a major problem. The server log is just
showing the process being killed. Is there any way We
can get a more meaningful error.
>--Original Message--
>Looks to me like the target SQL Server has re-set the sql
connection due to
>severe error of some sort - either sql engine error or
windows error
>perhaps. Have you checked the server logs on the target
server - both SQL &
>Windows logs to see if there's any further information
available?
>This could be such a wide variety of things - even disk
failure or table /
>index corruption within SQL Server. You might try running
a dbcc checkdb on
>the target server as well as part of your investigations.
>Regards,
>Greg Linwood
>SQL Server MVP
>"Scot" <sstyer@.alltel.net> wrote in message
>news:46df01c42bce$62d54d60$a401280a@.phx.gbl...
>> We have been getting an error on a DTS package
>> that is trying to load about 800,000 records.
>> It used to work without problems.
>> Now.. It will fail anywhere between record 8000
>> and record 160,000.
>> It is not one set record. Here is the error message.
>> [DBNETLIB][CONNECTIONWRITE (SEND()).]
>> GENERAL NETWORK ERROR
>> It will get this error even when the server itself
>> runs the package, and the file is on the server's
>> disk drive.
>> Does anyone have any ideas?
>
>.
>|||Have you got the package logging configured under Properties / Logging /
Errorfile?
Regards,
Greg Linwood
SQL Server MVP
<anonymous@.discussions.microsoft.com> wrote in message
news:4d2d01c42c66$7df1ee30$a401280a@.phx.gbl...
> We aren't seeing much in the event log. Nothing to
> indicate a major problem. The server log is just
> showing the process being killed. Is there any way We
> can get a more meaningful error.

Friday, February 24, 2012

Error:0 cannot load the DLL xpstar.dll

Have recently started getting this error in Sql2000 on W2KAdvanceServer.
Wasn't getting the error until about 2 weeks ago. Have not added any
software, drivers etc. Have verified the xpstar.dll is in the correct
directory and is accessible. Have uninstalled and reinstalled Sql 2000 and
the appropriate service packs twice and that has not helped. Have plenty of
hard disk space and RAM. There seems to be no Microsoft documentation or
troubleshooting guidance on this error, but several other people have
experienced it and reported it on other SQL user groups (however, no
resolutions). Any ideas?
Thanks
Victoria
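One quick way to test whether the DLL is actually loadable is to call an extended procedure hosted in it; to my knowledge xp_msver is implemented in xpstar.dll, so this sketch should reproduce the load failure if the DLL is the problem:

```sql
-- xp_msver is (as far as I know) implemented in xpstar.dll;
-- if the DLL cannot be loaded, this call should fail with the same error.
EXEC master..xp_msver;
```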
Does installing/re-installing MDAC 2.8 help?
"MSSQL" <colemanv@.stripes.osd.mil> wrote in message
news:uP$AkiynEHA.596@.TK2MSFTNGP11.phx.gbl...
> Have recently started getting this error in Sql2000 on W2KAdvanceServer.
> Wasn't getting the error until about 2 weeks ago. Have not added any
> software, drivers etc. Have verified the xpstar.dll is in the correct
> directory and is accessible. Have uninstalled and reinstalled Sql 2000
> and
> the appropriate service packs twice and that has not helped. Have plenty
> hd
> space and ram. Seems to be no Microsoft paper or troubleshooting on this
> error -but there are several people who have experienced it and report it
> on
> other sql user groups (however no resolutions). Any idea?
> Thanks
> Victoria
>
|||Is MDAC 2.8 part of the SP3 service pack, or part of SP2? Sorry to be
ignorant on this, but I did see MDAC being updated when I installed SP3.
How would I get it back to MDAC 2.8 without reinstalling? Once again,
I apologize for asking what is most likely a stupid question.
*** Sent via Developersdex http://www.codecomments.com ***
Don't just participate in USENET...get rewarded for it!
|||You can go to Microsoft.com and in the upper right search Microsoft.com for
MDAC. Second result should be MDAC 2.8 download. I have no idea if this
will solve the problem, but I do know it corrects a lot of problems I've had
with SQL Server.
This link will go directly to the MDAC download page also:
http://www.microsoft.com/downloads/d...DisplayLang=en
"Victoria Coleman" <colemanv@.stripes.osd.mil> wrote in message
news:%23pV5IDznEHA.3396@.tk2msftngp13.phx.gbl...
> Is MDAC 2.8 part of the SP3 service pack? or part of SP2? Sorry to be
> ignorant on this but I did see MDAC being updated when I installed SP3.
> How would I get it back to MDAC 2.8 without reinstalling? Once again -
> I aplogize for asking what is most likely a stupid question.
>
> *** Sent via Developersdex http://www.codecomments.com ***
> Don't just participate in USENET...get rewarded for it!
|||I am loading MDAC 2.8 now and will see if that fixes the problem.
Thanks for responding so quickly. I really appreciate it.
*** Sent via Developersdex http://www.codecomments.com ***
Don't just participate in USENET...get rewarded for it!

Error: Unable to load one or more of the requested types.

When I'm trying to deploy a SQLCLR assembly , I get the following error:

Error: Unable to load one or more of the requested types. Retrieve the LoaderExceptions property for more information.

This is the error you get when the assembly's types cannot be retrieved during reflection.

I was able to deploy it previously, and I'm not aware of any change I made that could make it refer to a different external type.

It has a reference to another SQLCLR assembly that was successfully deployed. If I remove all the references in the code to classes in that assembly, it works OK, so it looks like it cannot load that other assembly when trying to deploy the first one.

Is there any way to troubleshoot this? Getting at that LoaderExceptions property could be useful.

Any other hint?

Thanks

Are you using reflection against types that are in assemblies that are also registered in the database, or are these assemblies that were not registered using CREATE ASSEMBLY? If it's the latter, that's by design: you cannot load assemblies that were not registered with SQL Server.

Pablo Castro
Program Manager - ADO.NET Team
Microsoft Corp.

|||Thanks Pablo.

I ran into this issue again today, and it looks like a VS.NET thing.

I create two projects, one with an enumeration:

public enum MyEnum
{
MyValue,
MyValue2
}

and another project, with a reference to the first one, which declares a variable of that enumeration type:

MyEnum myEnum;

Then when I try to deploy the second project, I get

Error: Unable to load one or more of the requested types. Retrieve the LoaderExceptions property for more information.

But if I execute the CREATE ASSEMBLY commands manually, it works OK. The weird thing is that in my real-world case I could previously deploy the assemblies without any problem, and now I cannot.

So, it looks like a VS.NET issue. I'm not sure which is the right forum to post it ;)

|||Please make sure that the top-level assembly was compiled against the version of the referenced assembly that is registered in the database.

- Balaji Rathakrishnan|||It looks like it was caused by a versioning issue. Even though I had tried that before, it seems that setting an explicit '1.0.0.0' version fixed the problem.
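One way to check which version SQL Server actually has registered for each assembly is the sys.assemblies catalog view; the clr_name column includes the version. A sketch, run in the target database:

```sql
-- clr_name embeds the version the server has registered for each assembly;
-- compare it against the version your top-level assembly was compiled against.
SELECT name, clr_name, create_date
FROM sys.assemblies
WHERE is_user_defined = 1;
```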

Thanks a lot.

Andres|||

Sir,

I am also facing the same problem.

First I referenced the assembly locally, and at that time it worked properly.

But now I have moved the assemblies to the GAC, and it shows the above-mentioned error.

I have seen the above answers.

Can you tell me how to run the CREATE ASSEMBLY command?

Thanks
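For reference, a minimal CREATE ASSEMBLY sketch, registering the referenced assembly before the one that depends on it. The assembly names and paths below are illustrative placeholders, not names from this thread:

```sql
-- Register the referenced assembly first, then the dependent one.
-- Names and file paths are placeholders for illustration only.
CREATE ASSEMBLY MyReferencedAssembly
FROM 'C:\assemblies\MyReferencedAssembly.dll'
WITH PERMISSION_SET = SAFE;

CREATE ASSEMBLY MyMainAssembly
FROM 'C:\assemblies\MyMainAssembly.dll'
WITH PERMISSION_SET = SAFE;
```

If the dependent assembly is registered first, SQL Server cannot resolve the reference, which can surface as exactly this kind of type-load error.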
