Showing posts with label text. Show all posts

Tuesday, March 27, 2012

Estimate time to populate full text index

Can someone give me an estimated time to populate a full-text index on SQL Server 2000?
- 14 million rows,
- each row is about one paragraph long (approx. 400 words, stored in a long TEXT field),
- language of the text: mixed,
- hardware (H/W): a single 2.8 GHz CPU with 2 GB RAM and 300 GB disk space,
- incremental population.
It has been running for 3 days now and who knows where we are...
Any way to speed it up?
Thanks
mustafa,
Given your hardware configuration, a table with 14 million rows will
take a substantial amount of time, approx. 5 to 8 days.
Your best bet is to attempt to stop the full or incremental population and
consider horizontally partitioning the table into several smaller tables,
perhaps broken up by date or PK range. If you plan to continue this, I'd
highly recommend that you consider upgrading all aspects of your hardware.
Specifically, consider a multiple CPU server with 2 or more GB of RAM with
your FT Catalogs on a separate disk controller and disk array configured as
RAID0 or RAID10. You should also set the resource_usage level of the
MSSearch service to 5 via sp_fulltext_service 'resource_usage', 5. However,
this will not help you now. I'd also recommend that you review the SQL
Server 2000 BOL title "Full-text Search Recommendations" and the following
FT Deployment white paper: INF: SQL Server 2000 Full-Text Search Deployment
White Paper at:
http://support.microsoft.com/default...b;en-us;323739
Regards,
John
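John's two server-side suggestions can be sketched in T-SQL. The `sp_fulltext_service` and `sp_fulltext_catalog` procedures are the SQL Server 2000 interfaces he refers to; the catalog name `ft_catalog` below is just an illustrative placeholder:

```sql
-- Raise the MSSearch resource_usage level (1 = background ... 5 = dedicated).
EXEC sp_fulltext_service 'resource_usage', 5;

-- Stop a running population on a catalog (catalog name is illustrative).
EXEC sp_fulltext_catalog 'ft_catalog', 'stop';

-- Later, restart an incremental population on the same catalog.
EXEC sp_fulltext_catalog 'ft_catalog', 'start_incremental';
```

As John notes, raising `resource_usage` only helps future populations, not the one already running.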

Estimate on how long it might take to full-text index a table with 21,000 rows?

I need to full-text index a table so that I can easily search the text fields of that table. The table has about 21,000 rows, and I was wondering how long it might take to full-text index it?
Thanks
Between four seconds and three years, depending on hardware configuration, table contents, and server load.

On a (very slightly) more serious note, I don't know of any way to give you a meaningful estimate.

-PatP

Monday, March 26, 2012

Escaping [ characters with text lengths over 4000 characters

In a previous post I mentioned you will get into problems when trying to
update DataSets containing rows with [ characters in ntext or text field
columns. Andrew Conrad (thanks) mentioned that you should escape these, like
so [[]. This will help you, except when the length of the value is over 4000
characters (for ntext) in length. After that the problem of zero affected
rows arises again, despite escaping. How come? How to solve?
Thanks.
Alex
The REPLACE function (and other SQL Server string functions) will not operate
on data larger than 8000 bytes -- for an NVARCHAR datatype, that means 4000
characters (2 bytes per character). Can you perform the replace on the
client, before passing the data to SQL Server?
Adam Machanic
SQL Server MVP
http://www.sqljunkies.com/weblog/amachanic
--
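For reference, the `[[]` escape that Andrew suggested behaves like this in a LIKE pattern (a minimal sketch; the literal values are just examples):

```sql
-- In a LIKE pattern, [[] matches a single literal [ character,
-- while ] outside a bracket class is taken literally.
SELECT CASE WHEN 'value [x]' LIKE '%[[]x]%'
            THEN 'match' ELSE 'no match' END;  -- returns 'match'
```

The escaping itself is fine; the problem described in this thread is the comparison breaking once the (n)text value exceeds the 8000-byte limit of the string functions.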
"Alex Thissen" <athissen A T killer-apps.nl> wrote in message
news:uNIrcZPFFHA.3284@.TK2MSFTNGP09.phx.gbl...
> In a previous post I mentioned you will get into problems when trying to
> update DataSets containing rows with [ characters in ntext or text field
> columns. Andrew Conrad (thanks) mentioned that you should escape these,
like
> so [[]. This will help you, except when the length of the value is over
4000
> characters (for ntext) in length. After that the problem of zero affected
> rows arises again, despite escaping. How come? How to solve?
> Thanks.
> Alex
|||Hi Adam,
Thanks for thinking with me on this one. I don't use the REPLACE functions.
Instead, I make sure that the OriginalVersion of my DataRow in the DataSet
has the replaced value. Then the SqlXmlAdapter builds the UPDATE statement
for me, but it uses an optimistic locking scheme by comparing all columns
with the original values. For the (n)text fields the LIKE operator is used,
but as I mentioned this one breaks with values over 8000 bytes and [ chars
in it. I also wrote a (teasing) weblog entry on it, which you can find here:
http://www.alexthissen.nl/weblog/Per...-8feda8c42070.
You might want to read up on it, since this problem is hardly related to
SQLXML. It could be circumvented if there were a possibility to influence the
SQL that is generated. An option to exclude fields from the optimistic
locking would solve it directly.
So, my question remains: given that the LIKE operator breaks for lengths
over 8000 bytes with escaped [ characters in it (as [[]), how do I
get my DataSets that contain such values updated through SQLXML?
Thanks, Alex


Monday, March 19, 2012

Errors on importing a text file

When a DTS fails on a Text Source input with an error like "DTS_Transformation encountered an invalid data value for 'Column1' destination"

Is there a way to get the line number of the textfile where the import failed? It is hard to determine where in my 40,000-line file it found the invalid value for my column.

Thanks,
Andrew
Specify the LOG file in Package Properties/Logging.
|||Where does it store the logs on the server?
|||Wherever you would like it to be.
|||I set the error log file to a text file on my desktop - it seems to repeat the same error message (without the line number) - am I missing something?

Errors loading a text file into a Sql Server Destination

I am trying to load 14+ million rows from a text file into local Sql Server. I tried using Sql Server destination because it seemed to be faster, but after about 5 million rows it would always fail. See various errors below which I received while trying different variations of FirstRow/LastRow, Timeout, Table lock etc. After spending two days trying to get it to work, I switched to OLE DB Destination and it worked fine. I would like to get the Sql Server Destination working because it seems much faster, but the error messages aren't much help. Any ideas on how to fix?

Also, when I wanted to try loading just a small sample by specifying first row/last row, it would get to the upper limit, then pick up speed and look like it kept on reading rows of the source file until it failed. I expected it to just reach the limit I set and then stop processing.

[SS_DST tlkpDNBGlobal [41234]] Error: An OLE DB error has occurred. Error code: 0x80040E14. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "Reading from DTS buffer timed out.".

--
[SS_DST tlkpDNBGlobal [41234]] Error: The attempt to send a row to SQL Server failed with error code 0x80004005.
[DTS.Pipeline] Error: The ProcessInput method on component "SS_DST tlkpDNBGlobal" (41234) failed with error code 0xC02020C7. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
...
[FF_SRC DNBGlobal [6899]] Error: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
[DTS.Pipeline] Error: The PrimeOutput method on component "FF_SRC DNBGlobal" (6899) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

[DTS.Pipeline] Error: Thread "WorkThread1" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.


-
After first row/last row (from 1 to 1000000) limit is reached:
[SS_DST tlkpDNBGlobal [41234]] Error: An OLE DB error has occurred. Error code: 0x80040E14. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "Reading from DTS buffer timed out.".


When trying MaximumCommit = 1000000: it runs up to 1000000 OK, then slows down, and then errors.
[SS_DST tlkpDNBGlobal [41234]] Error: Unable to prepare the SSIS bulk insert for data insertion.

[DTS.Pipeline] Error: The PrimeOutput method on component "FF_SRC DNBGlobal" (6899) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

-
When attempting all in a single batch:
[OLE_DST tlkpDNBGlobal [57133]] Error: An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "The transaction log for database 'tempdb' is full. To find out why space in the log cannot be reused, see the log_reuse_wait_desc column in sys.databases". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Could not allocate space for object 'dbo.SORT temporary run storage: 156362715561984' in database 'tempdb' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.".

The commit size is most likely the problem. SQL Server will be storing those rows in memory until it is time to commit them. I suggest trying different commit sizes until you find one that is optimum.

You tried 1M? Try 500,000 rows - if that works, maybe try more, or if it fails try less. You should find a good value fairly quickly.

If this does not help, let us know more.

Donald Farmer

|||Thanks Donald. I set the MaxInsertCommitSize to 10K and it worked fine.|||

Hi, Donald

I have had the same problem; how/where do I set the commit size?

Thanks

Agbeko

|||

It wasn't available until SP1. You need to have SP1 installed then you will get MaxInsertCommitSize as a property of SQL Server Destination.

Thanks

Sutha

|||

Thanks for pointing that out Sutha. Remember also that commit size has always been available on the OLEDB Destination with the SQL Server provider.

Donald

|||

Hi, Sutha,

I am new to the SQL Server world. Dummy question:

referring to SP1, is it for Windows, SQL Server 2005, or Integration Services?

Thanks

Chuan

|||

They are talking about SP1 for SQL Server 2005

Rafael Salas

|||Thanks Rafael|||

Rafael Salas wrote:

They are talking about SP1 for SQL Server 2005

Rafael Salas

Which includes Integration Services by the way.

John, the only dumb questions are the ones that don't get asked.

-Jamie


Friday, March 9, 2012

Errors ACO0151E for tdp db backups

I have seen this error message for my tdp db backup jobs.
In my error text file, all the attempted backups were
completed, but there were 90 inactivated db log backup
messages. Why were the log backup messages shown in the
db backup error text file? Is error ACO0151E related to
these 90 inactivated log backups, while the SQL Server
Agent and the tdplogerr text file showed the tdp db log
backups completed successfully?
The first place to look would be the SQL error logs. Do they indicate
that the backup has been successful? You could also check the
Application event logs. If the backup runs successfully to a local
disk, then everything is OK as far as the "SQL Server" backup
is concerned.
Thanks,
Vikram Jayaram
Microsoft, SQL Server
This posting is provided "AS IS" with no warranties, and confers no rights.
Subscribe to MSDN & use http://msdn.microsoft.com/newsgroups.


Wednesday, March 7, 2012

ErrorLog size limit

Hi.

I have a service on a server which contains Sql Server Express.

This service adds documents to the database to allow full-text search features.

For some reason, the index crashes and then writes to the ErrorLog file; the error log jumps to 20 GB(!), causing the server to crash.

How can I:

1. limit the size of the error log?

2. find out why the log jumps to this size? (I can't open the error log file, of course...)

Thanks!

hi,

if you are referring to the transaction log, you can use the ALTER DATABASE command:

GO

ALTER DATABASE [Northwind] MODIFY FILE ( NAME = N'Northwind_log', MAXSIZE = 1024000KB )

GO

Regards,

joey
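If the question is about the SQL Server error log itself rather than the transaction log, the current log file can be closed and a fresh one started so no single file grows unbounded; a minimal sketch using the standard system procedure:

```sql
-- Close the current error log and start a new one.
-- SQL Server keeps the previous logs as ERRORLOG.1, ERRORLOG.2, etc.
EXEC sp_cycle_errorlog;
```

Cycling the log regularly (e.g. from a scheduled task) keeps each file small enough to open; it does not address why the full-text index crashes, which still needs to be diagnosed from the log contents.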

Error 2: 'Text' is not a member of 'System.Data.SqlClient.SqlParameter'

Error 2 'Text' is not a member of 'System.Data.SqlClient.SqlParameter'.

I think this error has something to do with a missing Imports statement; can anyone point me in the right direction?

Sorry, I've decided to upload all the code of Default.aspx.vb. I'm trying to make a mailing-list sign-up, so members can add their name and email for a monthly newsletter. I'm getting three errors, the one posted above on each of the three text boxes. What do I do?

Thanks very much

Partial Class _Default
Inherits System.Web.UI.Page

Protected Sub Button1_Click(ByVal sender As Object, ByVal e As System.EventArgs) Handles AddButton.Click

Dim connect As String
connect = "Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\Database.mdf;Integrated Security=True;User Instance=True"
Dim sqlconnect As New System.Data.SqlClient.SqlConnection(connect)
Dim command As String

command = "insert into DataForm values(@firstname, @lastname, @email)"
Dim sqlcommand As New System.Data.SqlClient.SqlCommand

sqlcommand.CommandText = command
sqlcommand.Connection = sqlconnect

Dim firstname As New System.Data.SqlClient.SqlParameter()

firstname.ParameterName = "@firstname"
firstname.Value = firstname.Text
firstname.SqlDbType = System.Data.SqlDbType.VarChar
sqlcommand.Parameters.Add(firstname)

Dim lastname As New System.Data.SqlClient.SqlParameter()

lastname.ParameterName = "@lastname"
lastname.Value = lastname.Text
lastname.SqlDbType = System.Data.SqlDbType.VarChar
sqlcommand.Parameters.Add(lastname)

Dim email As New System.Data.SqlClient.SqlParameter()

email.ParameterName = "@email"
email.Value = email.Text
email.SqlDbType = System.Data.SqlDbType.VarChar
sqlcommand.Parameters.Add(email)

sqlconnect.Open()
sqlcommand.ExecuteNonQuery()
sqlconnect.Close()

End Sub

Protected Sub SqlDataSource1_Selecting(ByVal sender As Object, ByVal e As System.Web.UI.WebControls.SqlDataSourceSelectingEventArgs) Handles SqlDataSource1.Selecting

End Sub

Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load

End Sub

Protected Sub AddButton_Click(ByVal sender As Object, ByVal e As System.EventArgs) Handles AddButton.Click

End Sub
End Class

|||

Hi,

I think you need to start naming your objects a bit better. It seems to me that both your textbox and your SqlParameter have the same name (e.g. firstname, lastname).

I'm pretty sure that's the source of your problem. Try renaming either the textbox control or the SqlParameter (or both), e.g. to txtFirstname and parFirstname.
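As a sketch of the rename (the control names txtFirstname etc. are assumptions, not names from the original post; `sqlcommand` is the command object from the code above):

```vb
' The TextBox and the SqlParameter must have different names:
' the SqlParameter has no .Text property, only the TextBox does.
Dim parFirstname As New System.Data.SqlClient.SqlParameter()
parFirstname.ParameterName = "@firstname"
parFirstname.Value = txtFirstname.Text   ' read from the renamed TextBox control
parFirstname.SqlDbType = System.Data.SqlDbType.VarChar
sqlcommand.Parameters.Add(parFirstname)
```

With the old names, `firstname.Text` resolved to the SqlParameter variable instead of the page control, which is exactly the "'Text' is not a member" error.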

Good Luck,

Tim Lensen

|||

thanks for the reply I'll give it a go and get back shortly!

Thanks!

-Ian

|||Thanks, that solved my problem!

Sunday, February 19, 2012

Error: The data types text and text are incompatible in the equal to operator.

I get this error:

The data types text and text are incompatible in the equal to operator.

when trying to execute this query

SELECT id FROM users WHERE username = @userName

Any ideas?

We'll need to know the data type of the username column, as well as the SqlDbType you are using for the @userName parameter.

|||I'm using text and text.

|||Can't do that. There are a number of things you can't do with blob data types, and that's one of them. Considering the name of the field is username, I'm guessing you should be using varchar instead.
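A sketch of the two usual fixes, assuming the values fit in a reasonable varchar length (the 256 here is arbitrary):

```sql
-- Option 1: change the column to a comparable type
ALTER TABLE users ALTER COLUMN username VARCHAR(256);

-- Option 2: cast at query time, if the column must remain text
SELECT id FROM users WHERE CAST(username AS VARCHAR(256)) = @userName;
```

Option 1 is preferable for a short value like a username, since the cast in option 2 prevents any index on the column from being used.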

Friday, February 17, 2012

Error: The backup of the file or filegroup "sysft_XXXX_FT" is not permitted because it is not online

Hi There,

I am trying to do a full backup of a user database which has full-text indexing on it. I use the maintenance plan wizard to set up a full backup. For some reason, I get this error.

=================

Executing the query "BACKUP DATABASE [XXXX] TO [XXXX]
WITH NOFORMAT, NOINIT, NAME = N'XXXX_backup_20060819040020', SKIP, REWIND, NOUNLOAD, STATS = 10
" failed with the following error:
"The backup of the file or filegroup "sysft_XXXX_FT" is not permitted because it is not online.
BACKUP can be performed by using the FILEGROUP or FILE clauses
to restrict the selection to include only online data.
BACKUP DATABASE is terminating abnormally.".
Possible failure reasons: Problems with the query,
"ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.

===============

Can anybody please suggest me what I should do?

Thanks.

Is the database accessible and online? i.e., not suspect?

|||I get the same error message. I have a total of 10 databases, and two of them give me the same error message when I try to back up using the maintenance utility. The DBs are used by the SharePoint Portal. If anyone has found a solution, please send it my way. Thank you.|||

The immediate cause of the error is obvious: The fulltext catalog is offline.

What needs some work is figuring out why the catalog went offline and what to do about it.

The simplest courses of action are either to drop the catalog if it's not needed, or to rebuild it if it is. As soon as the rebuild starts, you should be able to back up the database successfully. You don't need to wait for the rebuild to complete.
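For reference, both options can be scripted in T-SQL (a sketch; `MyCatalog` is a hypothetical name — substitute the catalog listed by the first query):

```sql
-- See which full-text catalogs the database has
SELECT name FROM sys.fulltext_catalogs;

-- Option 1: rebuild the catalog
-- (the backup can proceed once the rebuild has started)
ALTER FULLTEXT CATALOG MyCatalog REBUILD;

-- Option 2: drop the catalog if full-text search is no longer needed
-- (any full-text indexes belonging to it must be dropped first)
DROP FULLTEXT CATALOG MyCatalog;
```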

|||

This can also happen on a 64-bit SQL Server 2005 Standard cluster with SP1 (my configuration).
I found that the error was reporting the wrong "offline" full-text index: it named our current, good index when it should have named a differently named one that was in fact offline.

Our problem:
Check whether you have any "old" indexes left over from restores. Ours appeared during a backup and restore while moving from a single 32-bit box to a 64-bit cluster. It took forever to find out how to get rid of the old index. Go to SQL Server Management Studio > <machine/instance> > <databasename> > Storage > Full Text Catalogs

...and delete the offending full-text catalog definitions there. Mine didn't really exist (it was defined on a "d:\" drive that isn't even present in the new clustered system). You cannot make ad hoc queries against the system tables anymore in 2005, so this is the way to remove "hanging" full-text index definitions, allowing the backup to kick off successfully.


|||Our error was:
"The backup of the file or filegroup "sysft_Keywords" is not permitted because it is not online." [SQLSTATE 42000] (Error 3007) BACKUP DATABASE is terminating abnormally. [SQLSTATE 42000] (Error 3013). The step failed. Sql Severity 16; Sql Message ID 3013.
This database was restored/converted from SQL 2000. We did initially use SQL full-text search before developing our own contextual tool.
Solution: In SQL Server Management Studio, delete the items under Storage ==> Full Text Catalogs. This is the same as k12math's solution.|||

I'm thinking that solution is only relevant to the fully featured SQL Server Management Studio... is there any way to do this either in Management Studio Express or interactively? I've been using SQL Express (advanced version) for about a month now; last night's backup went fine, then today I get the "not permitted" error cited above. And only in this one database... the 15 or so others continue to back up normally.

BTW: I too restored these databases from SQL 2000 backups, and this one database had one column of one table marked for Full Text indexing at the time of the restore. I really don't need it, but apparently I didn't install the full text feature, so I can't remove it because I get an error that says the feature is not installed.

Any manual way around this problem? Thanks in advance.

Gordon

|||Are you sure that Express supports full text? And no, as of about six months ago there was no way to remove full-text index definitions manually (per Microsoft). That may have changed, but if it has, I am not aware of it. Good luck. My suggestion is to re-import the backup from 2000 after removing the definition (in 2000), which probably isn't an option now because there is probably new data, right? Let us know if you find anything.|||

k12math: Yes, you are right about the reimport... no longer an option because of a month of use. BUT: I did find a solution after several hours, so I'll share it with you!

1) Put the database into single user mode.

2) Start to detach the database. But before clicking OK for the detach, UNCHECK the box that says "Keep..." (the rest of the line is not immediately visible... what it says is, Keep fulltext catalogs. Get rid of that). Now OK the detach.

3) Reattach the database, and the backup works. Just like that.

Good luck everyone!

Gordon
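Gordon's GUI steps can also be scripted (a sketch; `MyDatabase` and the file paths are hypothetical placeholders for your own database name and .mdf/.ldf locations):

```sql
-- Step 1: single-user mode
ALTER DATABASE MyDatabase SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

-- Step 2: detach WITHOUT keeping the full-text catalog files
-- (this is the scripted equivalent of unchecking "Keep fulltext catalogs")
EXEC sp_detach_db @dbname = N'MyDatabase', @keepfulltextindexfile = 'false';

-- Step 3: reattach; afterwards the backup should succeed
EXEC sp_attach_db @dbname = N'MyDatabase',
    @filename1 = N'C:\Data\MyDatabase.mdf',
    @filename2 = N'C:\Data\MyDatabase_log.ldf';
```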

|||This worked well for me. Good job Gordonh.

Neil.

Error: The backup of the file or filegroup "sysft_XXXX_FT" is not permitted because it

Hi There,

I am trying to do a full back up of a user database which has full text indexing on it. I use the maintenance plan wizard to set up a full backup. For some reason, I get this error.

=================

Executing the query "BACKUP DATABASE [XXXX] TO [XXXX]
WITH NOFORMAT, NOINIT, NAME = N'XXXX_backup_20060819040020', SKIP, REWIND, NOUNLOAD, STATS = 10
" failed with the following error:
"The backup of the file or filegroup "sysft_XXXX_FT" is not permitted because it is not online.
BACKUP can be performed by using the FILEGROUP or FILE clauses
to restrict the selection to include only online data.
BACKUP DATABASE is terminating abnormally.".
Possible failure reasons: Problems with the query,
"ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.

===============

Can anybody please suggest me what I should do?

Thanks.

Is the database accessable and online ? i.e. : not suspect ?|||I get the same error message. I have a total of 10 databases & two of them give me the same error message when I try to backup using the maintenance utility. The d/b are used by the SharePoint Portal. If anyone has found a solution, please send my way. thank you|||

The immediate cause of the error is obvious: The fulltext catalog is offline.

What needs some work is figuring out why the catalog went offline and wnat to do about it.

The simplest courses of action are to either drop the catalog if it's not needed, or to rebuild it if it is needed. As soon as the rebuid starts, you should be able to successfully back up the database. You don't need to wait for the rebuild to complete.

|||

This can also happen for...
64bit 2005 Standard Cluster w/ SP1 (my configuration)
I found out that it was improperly reporting the "offline" full-text index. By improperly, I mean, that the error was reporting our current "Good" index when it should have been reporting one that had a different name that was in fact "offline".

Our problem:
You should look to see if you have any "old" indices from restores. Ours happened during a backup and restore from a single 32bit box moving to 64bit cluster. Took forever to find out how to get rid of the old index...Go to SQL Server Management Studio > <machine/instance> > <databasename> > Storage > Full Text Catalogs

...and delete the offending full text catalog definitions there. Mine didn't really exist (defined on a "d:\" drive that isn't even present in the new clustered system). You CANNOT make ad hoc queries to the db sys tables anymore in 2005. This will allow you to remove "hanging" full-text index definitions from the sys tables...allowing the backup to successfully kick off.

...our site Used Textbooks

|||Our error was:
"The backup of the file or filegroup "sysft_Keywords" is not permitted because it is not online."[SQLSTATE 42000] (Error 3007) BACKUP DATABASE is terminating abnormally. [SQLSTATE 42000] (Error 3013). The step failed. Sql Severity 16; Sql Message ID 3013.
This database was restored/converted from SQL2000. We did initially use SQL Full Text to perform searches before developing our own contextual tool.
Solution: In SQL Studio, Delete the items under the Full Text Catalogs ==> Storage. This is the same as k12math's solution.|||

I'm thinking that solution is only relevant to the fully featured SQL Management Studio... is there any way to do this either in Studio Management Express or interactively? I've been using SQL Express (advanced version) for about a month now; last night's backup went fine, then today I get the "not permitted" error cited above. And only in this one database... the 15 or so others continue to backup normally.

BTW: I too restored these databases from SQL 2000 backups, and this one database had one column of one table marked for Full Text indexing at the time of the restore. I really don't need it, but apparently I didn't install the full text feature, so I can't remove it because I get an error that says the feature is not installed.

Any manual way around this problem? Thanks in advance.

Gordon

|||Are you sure that Express supports Full Text? ...and no as of about 6 months ago there was no way to remove full text index definitions manually (per Microsoft). That may have changed but if it has I am not aware of it. Good luck. My suggestion is to reimport the backup from 2000 after removing the definition (in 2000). Which probably isn't an option now because there is probably new data right? Let us know if you find anything.|||

k12math: Yes, you are right about the reimport... no longer an option because of a month of use. BUT: I did find a solution after several hours, so I'll share it with you!

1) Put the database into single user mode.

2) Start to detach the database. But before clicking OK for the detach, UNCHECK the box that says "Keep..." (the rest of the line is not immediately visible... what it says is, Keep fulltext catalogs. Get rid of that). Now OK the detach.

3) Reattach the database, and the backup works. Just like that.

Good luck everyone!

Gordon

|||This worked well for me. Good job

Gordonh.

Neil.

Error: The backup of the file or filegroup "sysft_XXXX_FT" is not permitted because it

Hi There,

I am trying to do a full back up of a user database which has full text indexing on it. I use the maintenance plan wizard to set up a full backup. For some reason, I get this error.

=================

Executing the query "BACKUP DATABASE [XXXX] TO [XXXX]
WITH NOFORMAT, NOINIT, NAME = N'XXXX_backup_20060819040020', SKIP, REWIND, NOUNLOAD, STATS = 10
" failed with the following error:
"The backup of the file or filegroup "sysft_XXXX_FT" is not permitted because it is not online.
BACKUP can be performed by using the FILEGROUP or FILE clauses
to restrict the selection to include only online data.
BACKUP DATABASE is terminating abnormally.".
Possible failure reasons: Problems with the query,
"ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.

===============

Can anybody please suggest me what I should do?

Thanks.

Is the database accessable and online ? i.e. : not suspect ?|||I get the same error message. I have a total of 10 databases & two of them give me the same error message when I try to backup using the maintenance utility. The d/b are used by the SharePoint Portal. If anyone has found a solution, please send my way. thank you|||

The immediate cause of the error is obvious: The fulltext catalog is offline.

What needs some work is figuring out why the catalog went offline and wnat to do about it.

The simplest courses of action are to either drop the catalog if it's not needed, or to rebuild it if it is needed. As soon as the rebuid starts, you should be able to successfully back up the database. You don't need to wait for the rebuild to complete.

|||

This can also happen for...
64bit 2005 Standard Cluster w/ SP1 (my configuration)
I found out that it was improperly reporting the "offline" full-text index. By improperly, I mean, that the error was reporting our current "Good" index when it should have been reporting one that had a different name that was in fact "offline".

Our problem:
You should look to see if you have any "old" indices from restores. Ours happened during a backup and restore from a single 32bit box moving to 64bit cluster. Took forever to find out how to get rid of the old index...Go to SQL Server Management Studio > <machine/instance> > <databasename> > Storage > Full Text Catalogs

...and delete the offending full text catalog definitions there. Mine didn't really exist (defined on a "d:\" drive that isn't even present in the new clustered system). You CANNOT make ad hoc queries to the db sys tables anymore in 2005. This will allow you to remove "hanging" full-text index definitions from the sys tables...allowing the backup to successfully kick off.

...our site Used Textbooks

|||Our error was:
"The backup of the file or filegroup "sysft_Keywords" is not permitted because it is not online."[SQLSTATE 42000] (Error 3007) BACKUP DATABASE is terminating abnormally. [SQLSTATE 42000] (Error 3013). The step failed. Sql Severity 16; Sql Message ID 3013.
This database was restored/converted from SQL2000. We did initially use SQL Full Text to perform searches before developing our own contextual tool.
Solution: In SQL Studio, Delete the items under the Full Text Catalogs ==> Storage. This is the same as k12math's solution.|||

I'm thinking that solution is only relevant to the fully featured SQL Management Studio... is there any way to do this either in Studio Management Express or interactively? I've been using SQL Express (advanced version) for about a month now; last night's backup went fine, then today I get the "not permitted" error cited above. And only in this one database... the 15 or so others continue to backup normally.

BTW: I too restored these databases from SQL 2000 backups, and this one database had one column of one table marked for Full Text indexing at the time of the restore. I really don't need it, but apparently I didn't install the full text feature, so I can't remove it because I get an error that says the feature is not installed.

Any manual way around this problem? Thanks in advance.

Gordon

|||Are you sure that Express supports Full Text? ...and no as of about 6 months ago there was no way to remove full text index definitions manually (per Microsoft). That may have changed but if it has I am not aware of it. Good luck. My suggestion is to reimport the backup from 2000 after removing the definition (in 2000). Which probably isn't an option now because there is probably new data right? Let us know if you find anything.|||

k12math: Yes, you are right about the reimport... no longer an option because of a month of use. BUT: I did find a solution after several hours, so I'll share it with you!

1) Put the database into single user mode.

2) Start to detach the database. But before clicking OK for the detach, UNCHECK the box that says "Keep..." (the rest of the line is not immediately visible... what it says is, Keep fulltext catalogs. Get rid of that). Now OK the detach.

3) Reattach the database, and the backup works. Just like that.

Good luck everyone!

Gordon

|||This worked well for me. Good job

Gordonh.

Neil.

Error: The backup of the file or filegroup "sysft_XXXX_FT" is not permitted because it

Hi There,

I am trying to do a full back up of a user database which has full text indexing on it. I use the maintenance plan wizard to set up a full backup. For some reason, I get this error.

=================

Executing the query "BACKUP DATABASE [XXXX] TO [XXXX]
WITH NOFORMAT, NOINIT, NAME = N'XXXX_backup_20060819040020', SKIP, REWIND, NOUNLOAD, STATS = 10
" failed with the following error:
"The backup of the file or filegroup "sysft_XXXX_FT" is not permitted because it is not online.
BACKUP can be performed by using the FILEGROUP or FILE clauses
to restrict the selection to include only online data.
BACKUP DATABASE is terminating abnormally.".
Possible failure reasons: Problems with the query,
"ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.

===============

Can anybody please suggest me what I should do?

Thanks.

Is the database accessable and online ? i.e. : not suspect ?|||I get the same error message. I have a total of 10 databases & two of them give me the same error message when I try to backup using the maintenance utility. The d/b are used by the SharePoint Portal. If anyone has found a solution, please send my way. thank you|||

The immediate cause of the error is obvious: The fulltext catalog is offline.

What needs some work is figuring out why the catalog went offline and wnat to do about it.

The simplest courses of action are to either drop the catalog if it's not needed, or to rebuild it if it is needed. As soon as the rebuid starts, you should be able to successfully back up the database. You don't need to wait for the rebuild to complete.

|||

This can also happen for...
64bit 2005 Standard Cluster w/ SP1 (my configuration)
I found out that it was improperly reporting the "offline" full-text index. By improperly, I mean, that the error was reporting our current "Good" index when it should have been reporting one that had a different name that was in fact "offline".

Our problem:
You should look to see if you have any "old" indices from restores. Ours happened during a backup and restore from a single 32bit box moving to 64bit cluster. Took forever to find out how to get rid of the old index...Go to SQL Server Management Studio > <machine/instance> > <databasename> > Storage > Full Text Catalogs

...and delete the offending full text catalog definitions there. Mine didn't really exist (defined on a "d:\" drive that isn't even present in the new clustered system). You CANNOT make ad hoc queries to the db sys tables anymore in 2005. This will allow you to remove "hanging" full-text index definitions from the sys tables...allowing the backup to successfully kick off.

...our site Used Textbooks

|||Our error was:
"The backup of the file or filegroup "sysft_Keywords" is not permitted because it is not online."[SQLSTATE 42000] (Error 3007) BACKUP DATABASE is terminating abnormally. [SQLSTATE 42000] (Error 3013). The step failed. Sql Severity 16; Sql Message ID 3013.
This database was restored/converted from SQL2000. We did initially use SQL Full Text to perform searches before developing our own contextual tool.
Solution: In SQL Studio, Delete the items under the Full Text Catalogs ==> Storage. This is the same as k12math's solution.|||

I'm thinking that solution is only relevant to the fully featured SQL Management Studio... is there any way to do this either in Studio Management Express or interactively? I've been using SQL Express (advanced version) for about a month now; last night's backup went fine, then today I get the "not permitted" error cited above. And only in this one database... the 15 or so others continue to backup normally.

BTW: I too restored these databases from SQL 2000 backups, and this one database had one column of one table marked for Full Text indexing at the time of the restore. I really don't need it, but apparently I didn't install the full text feature, so I can't remove it because I get an error that says the feature is not installed.

Any manual way around this problem? Thanks in advance.

Gordon

|||Are you sure that Express supports Full Text? ...and no as of about 6 months ago there was no way to remove full text index definitions manually (per Microsoft). That may have changed but if it has I am not aware of it. Good luck. My suggestion is to reimport the backup from 2000 after removing the definition (in 2000). Which probably isn't an option now because there is probably new data right? Let us know if you find anything.|||

k12math: Yes, you are right about the reimport... no longer an option because of a month of use. BUT: I did find a solution after several hours, so I'll share it with you!

1) Put the database into single user mode.

2) Start to detach the database. But before clicking OK for the detach, UNCHECK the box that says "Keep..." (the rest of the line is not immediately visible... what it says is, Keep fulltext catalogs. Get rid of that). Now OK the detach.

3) Reattach the database, and the backup works. Just like that.

Good luck everyone!

Gordon

|||This worked well for me. Good job

Gordonh.

Neil.

Error: The backup of the file or filegroup "sysft_XXXX_FT" is not permitted becaus

Hi There,

I am trying to do a full back up of a user database which has full text indexing on it. I use the maintenance plan wizard to set up a full backup. For some reason, I get this error.

=================

Executing the query "BACKUP DATABASE [XXXX] TO [XXXX]
WITH NOFORMAT, NOINIT, NAME = N'XXXX_backup_20060819040020', SKIP, REWIND, NOUNLOAD, STATS = 10
" failed with the following error:
"The backup of the file or filegroup "sysft_XXXX_FT" is not permitted because it is not online.
BACKUP can be performed by using the FILEGROUP or FILE clauses
to restrict the selection to include only online data.
BACKUP DATABASE is terminating abnormally.".
Possible failure reasons: Problems with the query,
"ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.

===============

Can anybody please suggest me what I should do?

Thanks.

Is the database accessable and online ? i.e. : not suspect ?|||I get the same error message. I have a total of 10 databases & two of them give me the same error message when I try to backup using the maintenance utility. The d/b are used by the SharePoint Portal. If anyone has found a solution, please send my way. thank you|||

The immediate cause of the error is obvious: The fulltext catalog is offline.

What needs some work is figuring out why the catalog went offline and wnat to do about it.

The simplest courses of action are to either drop the catalog if it's not needed, or to rebuild it if it is needed. As soon as the rebuid starts, you should be able to successfully back up the database. You don't need to wait for the rebuild to complete.

|||

This can also happen for...
64bit 2005 Standard Cluster w/ SP1 (my configuration)
I found out that it was improperly reporting the "offline" full-text index. By improperly, I mean, that the error was reporting our current "Good" index when it should have been reporting one that had a different name that was in fact "offline".

Our problem:
You should look to see if you have any "old" indices from restores. Ours happened during a backup and restore from a single 32bit box moving to 64bit cluster. Took forever to find out how to get rid of the old index...Go to SQL Server Management Studio > <machine/instance> > <databasename> > Storage > Full Text Catalogs

...and delete the offending full text catalog definitions there. Mine didn't really exist (defined on a "d:\" drive that isn't even present in the new clustered system). You CANNOT make ad hoc queries to the db sys tables anymore in 2005. This will allow you to remove "hanging" full-text index definitions from the sys tables...allowing the backup to successfully kick off.

...our site Used Textbooks

|||Our error was:
"The backup of the file or filegroup "sysft_Keywords" is not permitted because it is not online."[SQLSTATE 42000] (Error 3007) BACKUP DATABASE is terminating abnormally. [SQLSTATE 42000] (Error 3013). The step failed. Sql Severity 16; Sql Message ID 3013.
This database was restored/converted from SQL2000. We did initially use SQL Full Text to perform searches before developing our own contextual tool.
Solution: In SQL Studio, Delete the items under the Full Text Catalogs ==> Storage. This is the same as k12math's solution.|||

I'm thinking that solution is only relevant to the fully featured SQL Management Studio... is there any way to do this either in Studio Management Express or interactively? I've been using SQL Express (advanced version) for about a month now; last night's backup went fine, then today I get the "not permitted" error cited above. And only in this one database... the 15 or so others continue to backup normally.

BTW: I too restored these databases from SQL 2000 backups, and this one database had one column of one table marked for Full Text indexing at the time of the restore. I really don't need it, but apparently I didn't install the full text feature, so I can't remove it because I get an error that says the feature is not installed.

Any manual way around this problem? Thanks in advance.

Gordon

|||Are you sure that Express supports Full Text? ...and no as of about 6 months ago there was no way to remove full text index definitions manually (per Microsoft). That may have changed but if it has I am not aware of it. Good luck. My suggestion is to reimport the backup from 2000 after removing the definition (in 2000). Which probably isn't an option now because there is probably new data right? Let us know if you find anything.|||

k12math: Yes, you are right about the reimport... no longer an option because of a month of use. BUT: I did find a solution after several hours, so I'll share it with you!

1) Put the database into single user mode.

2) Start to detach the database. But before clicking OK for the detach, UNCHECK the box that says "Keep..." (the rest of the line is not immediately visible... what it says is, Keep fulltext catalogs. Get rid of that). Now OK the detach.

3) Reattach the database, and the backup works. Just like that.

Good luck everyone!

Gordon

|||This worked well for me. Good job

Gordonh.

Neil.

Error: The backup of the file or filegroup "sysft_XXXX_FT" is not permitted becaus

Hi There,

I am trying to do a full back up of a user database which has full text indexing on it. I use the maintenance plan wizard to set up a full backup. For some reason, I get this error.

=================

Executing the query "BACKUP DATABASE [XXXX] TO [XXXX]
WITH NOFORMAT, NOINIT, NAME = N'XXXX_backup_20060819040020', SKIP, REWIND, NOUNLOAD, STATS = 10
" failed with the following error:
"The backup of the file or filegroup "sysft_XXXX_FT" is not permitted because it is not online.
BACKUP can be performed by using the FILEGROUP or FILE clauses
to restrict the selection to include only online data.
BACKUP DATABASE is terminating abnormally.".
Possible failure reasons: Problems with the query,
"ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.

===============

Can anybody please suggest me what I should do?

Thanks.

Is the database accessable and online ? i.e. : not suspect ?|||I get the same error message. I have a total of 10 databases & two of them give me the same error message when I try to backup using the maintenance utility. The d/b are used by the SharePoint Portal. If anyone has found a solution, please send my way. thank you|||

The immediate cause of the error is obvious: The fulltext catalog is offline.

What needs some work is figuring out why the catalog went offline and wnat to do about it.

The simplest courses of action are to either drop the catalog if it's not needed, or to rebuild it if it is needed. As soon as the rebuid starts, you should be able to successfully back up the database. You don't need to wait for the rebuild to complete.

|||

This can also happen for...
64bit 2005 Standard Cluster w/ SP1 (my configuration)
I found out that it was improperly reporting the "offline" full-text index. By improperly, I mean, that the error was reporting our current "Good" index when it should have been reporting one that had a different name that was in fact "offline".

Our problem:
You should look to see if you have any "old" indices from restores. Ours happened during a backup and restore from a single 32bit box moving to 64bit cluster. Took forever to find out how to get rid of the old index...Go to SQL Server Management Studio > <machine/instance> > <databasename> > Storage > Full Text Catalogs

...and delete the offending full text catalog definitions there. Mine didn't really exist (defined on a "d:\" drive that isn't even present in the new clustered system). You CANNOT make ad hoc queries to the db sys tables anymore in 2005. This will allow you to remove "hanging" full-text index definitions from the sys tables...allowing the backup to successfully kick off.


|||Our error was:
"The backup of the file or filegroup "sysft_Keywords" is not permitted because it is not online."[SQLSTATE 42000] (Error 3007) BACKUP DATABASE is terminating abnormally. [SQLSTATE 42000] (Error 3013). The step failed. Sql Severity 16; Sql Message ID 3013.
This database was restored/converted from SQL 2000. We did initially use SQL full-text search before developing our own contextual tool.
Solution: In SQL Server Management Studio, delete the items under Storage ==> Full Text Catalogs. This is the same as k12math's solution.|||

I'm thinking that solution is only relevant to the fully featured SQL Server Management Studio... is there any way to do this in Management Studio Express, or interactively? I've been using SQL Express (the Advanced version) for about a month now; last night's backup went fine, then today I get the "not permitted" error cited above. And only in this one database... the 15 or so others continue to back up normally.

BTW: I too restored these databases from SQL 2000 backups, and this one database had one column of one table marked for full-text indexing at the time of the restore. I don't really need it, but apparently I didn't install the full-text feature, so I can't remove it: I get an error saying the feature is not installed.

Any manual way around this problem? Thanks in advance.

Gordon

|||Are you sure that Express supports full text? ...and no, as of about six months ago there was no way to remove full-text index definitions manually (per Microsoft). That may have changed, but if it has I'm not aware of it. Good luck. My suggestion is to reimport the backup from 2000 after removing the definition (in 2000), which probably isn't an option now because there's probably new data, right? Let us know if you find anything.|||

k12math: Yes, you are right about the reimport... no longer an option because of a month of use. BUT: I did find a solution after several hours, so I'll share it with you!

1) Put the database into single user mode.

2) Start to detach the database. But before clicking OK, UNCHECK the box labeled "Keep..." (the rest of the line isn't immediately visible; it says "Keep fulltext catalogs" — get rid of that). Now OK the detach.

3) Reattach the database, and the backup works. Just like that.
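
Gordon's GUI steps map roughly to this T-SQL. This is a sketch only; the database name and file paths are placeholders you'd replace with your own.

```sql
-- 1) Single-user mode so the detach can proceed.
ALTER DATABASE YourDatabase SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

-- 2) Detach WITHOUT keeping the full-text catalog files
--    (this is the "Keep fulltext catalogs" checkbox).
EXEC sp_detach_db @dbname = 'YourDatabase',
                  @keepfulltextindexfile = 'false';

-- 3) Reattach; the orphaned catalog definition is gone
--    and the backup should succeed.
CREATE DATABASE YourDatabase
    ON (FILENAME = 'C:\Data\YourDatabase.mdf'),
       (FILENAME = 'C:\Data\YourDatabase_log.ldf')
    FOR ATTACH;
```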

Good luck everyone!

Gordon

|||This worked well for me. Good job

Gordonh.

Neil.