
Thursday, March 22, 2012

Create a file using a SQL DB Trigger

Is there a way to create a text file (such as a Windows Notepad file)
by using a trigger on a table? What I want to do is to send a row of
information to a table where the table: tblFileData has only one
column: txtOutput
I want to use the DB front end (MS Access) to send the text string to
the SQL backend, then have the SQL Server create a file to a path,
such as F:/myfiledate.txt that holds the text in txtOutput, then the
trigger deletes the row in tblFileData.
Can this be done easily?
Any help is appreciated

Hi

I am not sure why you should want to do this. If the transaction is rolled
back then the file will have different information to the database table.
You may want to look at the extended stored procedure xp_cmdshell, such as
http://tinyurl.com/64azq. Using the echo command will write the information
into a file, e.g. select @sql = 'echo ' + @v + ' > \\Myserver\MyShare\myfiledate.txt'

Note: The example does not take into account multiple rows in the inserted
table.

John
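For illustration only, a minimal sketch of the approach John describes, assuming a single-row insert, the table and column named in the question, and a share that the SQL Server service account can write to (the trigger name and path are invented):

CREATE TRIGGER trg_tblFileData_Insert ON tblFileData
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON

    DECLARE @v   varchar(1000),   -- text to write out
            @cmd varchar(2000)    -- command handed to xp_cmdshell

    -- assumes exactly one row in inserted; see the note above about multi-row inserts
    SELECT @v = txtOutput FROM inserted

    -- echo the text into a file on the share
    SELECT @cmd = 'echo ' + @v + ' > \\Myserver\MyShare\myfiledate.txt'
    EXEC master.dbo.xp_cmdshell @cmd, no_output

    -- remove the row again, as the original poster wanted
    DELETE tblFileData WHERE txtOutput IN (SELECT txtOutput FROM inserted)
END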

"Lauren Quantrell" <laurenquantrell@.hotmail.com> wrote in message
news:47e5bd72.0407131443.33a873bb@.posting.google.c om...
> Is there a way to create a text file (such as a Windows Notepad file)
> by using a trigger on a table? What I want to do is to send a row of
> information to a table where the table: tblFileData has only one
> column: txtOutput
> I want to use the DB front end (MS Access) to send the text string to
> the SQL backend, then have the SQL Server create a file to a path,
> such as F:/myfiledate.txt that holds the text in txtOutput, then the
> trigger deletes the row in tblFileData.
> Can this be done easily?
> Any help is appreciated|||John,
Thanks for your reply.
In this application, the client needs to be able to make the data
"portable" to users who they do not want to have access to the
application itself. The idea is to create a small text document of a
small amount of data, then post the data onto a server where the users
can download this file onto a portable device. The file may be updated
repeatedly, but at the time the user downloads it, it will be the latest
file. The file itself contains a small list of customer orders and
instructions for the user. I know this is not the ideal way to do this
but it makes sense for this customer within their business practices.
I have avoided using triggers at all in this rather large application,
but this seems like the point where I need to take the plunge.
What I have is a table tblFileData and it has the columns: txtOutput,
txtEmployeeName, txtOrderNumber, txtOrderDateTime.
What I need is to create a trigger so that when a record is inserted
into the table, the trigger fires and creates a .txt file into a
mapped path on the server
f:\(txtOrderDate)"-"(txtOrderNumber)"-"(txtEmployeeName).txt with
txtOutput as the body of the file (example:
f:\20041225-987654-smithJohn.txt)
Then after creating the txt file, the trigger deletes the table row.

I have to admit I'm completely at ground zero when it comes to doing
this with a trigger so any help in constructing this is appreciated.
lq

"John Bell" <jbellnewsposts@.hotmail.com> wrote in message news:<GO7Jc.2822$Ec4.33496260@.news-text.cableinet.net>...
> Hi
> I am not sure why you should want to do this. If the transaction is rolled
> back then the file will have different information to the database table.
> You may want to look at the stored procedure xp_cmdshell, such as
> http://tinyurl.com/64azq. Use of the echo command will write the information
> into a file select @.sql = echo ' + @.v + ' >
> \\Myserver\MyShare\myfiledate.txt'
> Note: The example does not take into account multiple rows in the inserted
> table.
> John
> "Lauren Quantrell" <laurenquantrell@.hotmail.com> wrote in message
> news:47e5bd72.0407131443.33a873bb@.posting.google.c om...
> > Is there a way to create a text file (such as a Windows Notepad file)
> > by using a trigger on a table? What I want to do is to send a row of
> > information to a table where the table: tblFileData has only one
> > column: txtOutput
> > I want to use the DB front end (MS Access) to send the text string to
> > the SQL backend, then have the SQL Server create a file to a path,
> > such as F:/myfiledate.txt that holds the text in txtOutput, then the
> > trigger deletes the row in tblFileData.
> > Can this be done easily?
> > Any help is appreciated|||Hi

The example I posted would have given you just about everything you needed
to create these files, although as I pointed out it is not necessarily going
to reflect the actual data! Usually this sort of task is carried out by a
scheduled (possibly DTS) job, although if you need it to run very often
it may be better to change it to an on-demand process.

Books online will give you more details regarding DTS along with the
following site http://www.sqldts.com/default.aspx?200

Another alternative would be to provide a web interface.

John

"Lauren Quantrell" <laurenquantrell@.hotmail.com> wrote in message
news:47e5bd72.0407170613.22ad2ee0@.posting.google.c om...
> John,
> Thanks for your reply.
> In this application, the client needs to be able to make the data
> "portable" to users who they do not want to have access to the
> application itself. The idea is to create a small text document of a
> small amount of data, then post the data onto a server where the users
> can download this file onto a portable device. The file may be updated
> repeadly, but at the time the user downloads it, it will be the latest
> file. The file itself contains a small list of customer orders and
> instructions for the user. I know this is not the ideal way to do this
> but it makes sense for this customer within their business practices.
> I have avoided using triggers at all in this rather large application,
> but this seems like the point where I need to make the plunge.
> What I have is a table tblFileData and it has the columns: txtOutput,
> txtEmployeeName, txtOrderNumber, txtOrderDateTime.
> What I need is to create a trigger so that when a record is inserted
> into the table, the trigger fires and creates a .txt file into a
> mapped path on the server
> f:\(txtOrderDate)"-"(txtOrderNumber)"-"(txtEmployeeName).txt with
> txtOutput as the body of the file (example:
> f:\20041225-987654-smithJohn.txt)
> Then after creating the txt file, the trigger deletes the table row.
> I have to admit I'm completely at ground zero when it comes to doing
> this with a trigger so any help in constructing this is appreciated.
> lq
>
> "John Bell" <jbellnewsposts@.hotmail.com> wrote in message
news:<GO7Jc.2822$Ec4.33496260@.news-text.cableinet.net>...
> > Hi
> > I am not sure why you should want to do this. If the transaction is
rolled
> > back then the file will have different information to the database
table.
> > You may want to look at the stored procedure xp_cmdshell, such as
> > http://tinyurl.com/64azq. Use of the echo command will write the
information
> > into a file select @.sql = echo ' + @.v + ' >
> > \\Myserver\MyShare\myfiledate.txt'
> > Note: The example does not take into account multiple rows in the
inserted
> > table.
> > John
> > "Lauren Quantrell" <laurenquantrell@.hotmail.com> wrote in message
> > news:47e5bd72.0407131443.33a873bb@.posting.google.c om...
> > > Is there a way to create a text file (such as a Windows Notepad file)
> > > by using a trigger on a table? What I want to do is to send a row of
> > > information to a table where the table: tblFileData has only one
> > > column: txtOutput
> > > I want to use the DB front end (MS Access) to send the text string to
> > > the SQL backend, then have the SQL Server create a file to a path,
> > > such as F:/myfiledate.txt that holds the text in txtOutput, then the
> > > trigger deletes the row in tblFileData.
> > > Can this be done easily?
> > > Any help is appreciated|||Lauren Quantrell (laurenquantrell@.hotmail.com) writes:
> In this application, the client needs to be able to make the data
> "portable" to users who they do not want to have access to the
> application itself. The idea is to create a small text document of a
> small amount of data, then post the data onto a server where the users
> can download this file onto a portable device. The file may be updated
> repeadly, but at the time the user downloads it, it will be the latest
> file. The file itself contains a small list of customer orders and
> instructions for the user. I know this is not the ideal way to do this
> but it makes sense for this customer within their business practices.
> I have avoided using triggers at all in this rather large application,
> but this seems like the point where I need to make the plunge.
> What I have is a table tblFileData and it has the columns: txtOutput,
> txtEmployeeName, txtOrderNumber, txtOrderDateTime.
> What I need is to create a trigger so that when a record is inserted
> into the table, the trigger fires and creates a .txt file into a
> mapped path on the server
> f:\(txtOrderDate)"-"(txtOrderNumber)"-"(txtEmployeeName).txt with
> txtOutput as the body of the file (example:
> f:\20041225-987654-smithJohn.txt)
> Then after creating the txt file, the trigger deletes the table row.
> I have to admit I'm completely at ground zero when it comes to doing
> this with a trigger so any help in constructing this is appreciated.

If you don't know how to write this trigger, this is a good thing,
because that means that you will not do something which is really bad.

It could make sense to write to a text file from a trigger for debugging
purposes, but for what you describe above, I go as far as to say that this
is an unacceptable solution. There are two major problems:

1) The data you write to the file is uncommitted. If an error occurs
in the transaction, and it is rolled back, the user goes out in the
field with bogus data.

2) Forking out to write to files is slow, particularly if you will do
it for many rows - and you will have to iterate over them. This
could be a major bottleneck in your application.

The best would be to write a mini-app that accesses the databases and
gets data to the portable devices. If you absolutely must have files,
set up a job in SQL Agent which runs once a minute and gets rows and
updates the file with the most recent data.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp|||Erland,
As always, you've been very helpful and I appreciate your thoughts on
this.
In this installation, the design of the output is dictated by the
client because of other apps that will use this text file. It's a
first generation attempt on their part to deal with data that they
don't want the users to link to in any way.
So I'm stuck with it.
Also, they don't want users to have access to the path f:\ otherwise
I'd just do the whole thing in VBA with the Access front end.
My concept of this is that the trigger will fire on the inserting of a
new record in my table tblFileData - the trigger will create the file
and then the trigger will delete the row. It will not have to loop
through rows and there are only four columns in the table: txtOutput,
txtEmployeeName, txtOrderNumber, txtOrderDateTime.
The output needs to be:
f:\(txtOrderDate)"-"(txtOrderNumber)"-"(txtEmployeeName).txt with
txtOutput as the body of the file (example:
f:\20041225-987654-smithJohn.txt)
The front end application handles the inserting of a row in
thbFileData every time a change is made to one form, another app (on I
don't have control over) overwrites any previous files, so the output
file will contain the most recent.

I am totally under the gun on this one. Is this trigger easy to
construct? I have never constructed one and I'm at zero hour. Ouch.

lq

Erland Sommarskog <esquel@.sommarskog.se> wrote in message news:<Xns952B28401363Yazorman@.127.0.0.1>...
> Lauren Quantrell (laurenquantrell@.hotmail.com) writes:
> > In this application, the client needs to be able to make the data
> > "portable" to users who they do not want to have access to the
> > application itself. The idea is to create a small text document of a
> > small amount of data, then post the data onto a server where the users
> > can download this file onto a portable device. The file may be updated
> > repeadly, but at the time the user downloads it, it will be the latest
> > file. The file itself contains a small list of customer orders and
> > instructions for the user. I know this is not the ideal way to do this
> > but it makes sense for this customer within their business practices.
> > I have avoided using triggers at all in this rather large application,
> > but this seems like the point where I need to make the plunge.
> > What I have is a table tblFileData and it has the columns: txtOutput,
> > txtEmployeeName, txtOrderNumber, txtOrderDateTime.
> > What I need is to create a trigger so that when a record is inserted
> > into the table, the trigger fires and creates a .txt file into a
> > mapped path on the server
> > f:\(txtOrderDate)"-"(txtOrderNumber)"-"(txtEmployeeName).txt with
> > txtOutput as the body of the file (example:
> > f:\20041225-987654-smithJohn.txt)
> > Then after creating the txt file, the trigger deletes the table row.
> > I have to admit I'm completely at ground zero when it comes to doing
> > this with a trigger so any help in constructing this is appreciated.
> If you don't know how to write this trigger, this is a good thing,
> because that means that you will not do something which is really bad.
> It could make sense to write to a text filr from a trigger for debugging
> purposes, but for what you describe above, I go as far to say that this
> is an unacceptable solution. There are two major problems:
> 1) The data you write to the file is uncommitted. If an error occurs
> in the transaction, and it is rolled back, the user goes out on the
> field with bogus data.
> 2) Forking out to write to files is slow, particularly if you will do
> it for many rows - and you will have to iterate over them. This
> could be a major bottleneck in your application.
> The best would be to write a mini-app that accesses the databases and
> gets data to the portable devices. If you absolutely must have files,
> set up a job in SQL Agent which runs once a minute and gets rows and
> updates the file with the most recent data.|||Lauren Quantrell (laurenquantrell@.hotmail.com) writes:
> In this installation, the design of the output is dictated by the
> client because of other apps that will use this text file. It's a
> first generation attempt on their part to deal with data that they
> don't want the users to link to in any way.
> So I'm stuck with it.
> Also, they don't want users to have access to the path f:\ otherwise
> I'd just do the whole thing in VBA with the Access front end.
> My concept of this is that the trigger will fire on the inserting of a
> new record in my table tblFileData - the trigger will create the file
> and then the trigger will delete the row.

Huh? So the row is not to be persisted? In that case, you could use
an INSTEAD OF trigger. An INSTEAD OF trigger fires instead of the command,
which means that if you don't redo the command in the trigger, the command
will not be carried out. The main target for INSTEAD OF triggers is views,
where a trigger would divert the data to the appropriate base tables. But
you could use it for anything.

> It will not have to loop through rows

Since a trigger fires once per statement, and thus can cover many rows,
your code must be able to handle multi-row inserts.

> I am totally under the gun on this one. Is this trigger easy to
> construct? I have never contsructed one and I'm at zero hour. Ouch.

It's too easy to construct. In a trigger you have the virtual tables
"inserted" and "deleted" to play with. (In an INSERT trigger, only
"inserted".) These tables hold the before-image ("deleted") and
after-image ("inserted") of the affected rows. Then you can use
xp_cmdshell to execute DOS commands, including writing to a file.

The catch here is that you need to grant users access to xp_cmdshell,
which is in master. But once you have done this, a user who manages to
find a query tool can log in and run whatever commands he likes with
xp_cmdshell. That's why I say it is too easy.

One way to address this is to write a wrapper stored procedure that you
place in master and which accepts the arguments. This wrapper would then
call xp_cmdshell, and you would grant access to the wrapper. You would
have to enable cross-database ownership chaining for this to work.
Which again is a security consideration to think twice over.
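As a rough sketch of that wrapper idea, with invented procedure, parameter and role names, and assuming xp_cmdshell and cross-database ownership chaining have been set up as described:

USE master
GO
CREATE PROCEDURE dbo.usp_WriteOrderFile
    @FileName varchar(255),
    @Body     varchar(1000)
AS
BEGIN
    SET NOCOUNT ON
    DECLARE @cmd varchar(2000)

    -- build the echo command; this sketch does no validation of the file name
    SELECT @cmd = 'echo ' + @Body + ' > "' + @FileName + '"'
    EXEC master.dbo.xp_cmdshell @cmd, no_output
END
GO
-- users are then granted execute on the wrapper rather than on xp_cmdshell itself
GRANT EXECUTE ON dbo.usp_WriteOrderFile TO OrderFileUsers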

Yet another alternative is to write your own extended stored procedure
that writes to the file, but this requires C programming skills.

Maybe you should rethink and find another solution, after all.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp|||Lauren Quantrell (laurenquantrell@.hotmail.com) writes:
> The only problem is, I need to create a file that only contains text
> from one of the columns, a column named OutputText.
> DO you know how I can do this?

You will need to use a format file with your BCP command. The format
file for this particular case would look like this:

8.0
1
1 SQLCHAR 0 0 "\r\n" ? "" ""

Replace the question mark with the number of the column you want to export
to the file. (Numbering starts at 1.)

The last two columns that I've just set as "" are the column name
(informational only to BCP) and the collation for the column in the
file.
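Put together, a hedged sketch of what the export might look like with such a format file, using invented database, view and path names (the server column ordinal is assumed to be 2 here; adjust it to the column you actually want):

-- Contents of C:\output\onecolumn.fmt:
--   8.0
--   1
--   1   SQLCHAR   0   0   "\r\n"   2   ""   ""

DECLARE @cmd varchar(500)
SET @cmd = 'bcp MyDatabase.dbo.vOutput out "C:\output\orders.txt" '
         + '-f C:\output\onecolumn.fmt -T -S ' + @@SERVERNAME
EXEC master.dbo.xp_cmdshell @cmd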

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp|||John,
OK, I trashed the idea of a trigger and have adopted your suggestion
of using xp_cmdshell from a stored procedure. Whenever the data is
inserted into the table in the SP, the same SP calls xp_cmdshell.

This is what I have done...
I have created a view named vOutput that shows one column in a table.
I insert a row into the table and then I'm using this code to create a
file with the text in the single row.

This code works fine when I'm signed on with an account that has
server admin rights; however, it fails when I sign in with an account
that does not have server admin rights.

Can you shed some light on solving this?

ALTER PROCEDURE OutputOrders
    @FileName nvarchar(50)

AS
SET NOCOUNT ON

DECLARE @ReturnCode int
DECLARE @ExportCommand varchar(255)

-- build the BCP command that exports the single-column view to the target file
SET @ExportCommand =
    'BCP myServerName.dbo.vOutput out "c:\output\order files\' +
    @FileName +
    '" -T -c -S ' + @@SERVERNAME
EXEC @ReturnCode = master.dbo.xp_cmdshell @ExportCommand

I have granted execute permission to xp_cmdshell for both users and on
the view.

"John Bell" <jbellnewsposts@.hotmail.com> wrote in message news:<GO7Jc.2822$Ec4.33496260@.news-text.cableinet.net>...
> Hi
> I am not sure why you should want to do this. If the transaction is rolled
> back then the file will have different information to the database table.
> You may want to look at the stored procedure xp_cmdshell, such as
> http://tinyurl.com/64azq. Use of the echo command will write the information
> into a file select @.sql = echo ' + @.v + ' >
> \\Myserver\MyShare\myfiledate.txt'
> Note: The example does not take into account multiple rows in the inserted
> table.
> John
> "Lauren Quantrell" <laurenquantrell@.hotmail.com> wrote in message
> news:47e5bd72.0407131443.33a873bb@.posting.google.c om...
> > Is there a way to create a text file (such as a Windows Notepad file)
> > by using a trigger on a table? What I want to do is to send a row of
> > information to a table where the table: tblFileData has only one
> > column: txtOutput
> > I want to use the DB front end (MS Access) to send the text string to
> > the SQL backend, then have the SQL Server create a file to a path,
> > such as F:/myfiledate.txt that holds the text in txtOutput, then the
> > trigger deletes the row in tblFileData.
> > Can this be done easily?
> > Any help is appreciated|||Hi

This is direct from books online. I assume that it is a permissions problem
on xp_cmdshell and not on the directory itself:
"By default, only members of the sysadmin fixed server role can execute this
extended stored procedure. You may, however, grant other users permission to
execute this stored procedure.

When xp_cmdshell is invoked by a user who is a member of the sysadmin fixed
server role, xp_cmdshell will be executed under the security context in
which the SQL Server service is running. When the user is not a member of
the sysadmin group, xp_cmdshell will impersonate the SQL Server Agent proxy
account, which is specified using xp_sqlagent_proxy_account. If the proxy
account is not available, xp_cmdshell will fail. This is true only for
Microsoft Windows NT 4.0 and Windows 2000. On Windows 9.x, there is no
impersonation and xp_cmdshell is always executed under the security context
of the Windows 9.x user who started SQL Server."

If this is not an xp_cmdshell permission problem then it could be that access
to the directory for the SQL Server Agent proxy account is not valid or the
user account that started SQL Server for windows 9.x.

As a debug option you may want to execute a "dir C:\*" command.

John
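A quick, hedged way to test that hypothesis from the non-sysadmin login (the dir target is arbitrary): if the plain dir fails as well, the problem is the proxy account or its file permissions rather than the BCP command itself.

-- run while connected as the non-sysadmin user
EXEC master.dbo.xp_cmdshell 'dir C:\*'

-- on SQL Server 2000 a sysadmin can set the SQL Agent proxy account used for
-- non-sysadmin callers roughly like this (domain, account and password are placeholders):
EXEC master.dbo.xp_sqlagent_proxy_account N'SET', N'MYDOMAIN', N'ProxyUser', N'password'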

"Lauren Quantrell" <laurenquantrell@.hotmail.com> wrote in message
news:47e5bd72.0407230525.2fc6209b@.posting.google.c om...
> John,
> OK, I trashed the isea of a trigger and have adopted your suggection
> of using xp_cmdshell from a stored procedure. Whenever the data is
> inserted into the table in the SP, the same SP calls xp_cmdshell.
> This is what I have done...
> I have created a view named vOutput that shows one column in a table.
> I insert a row into the table and then I'm using this code to create a
> file with the text in the single row.
> This code works fine when I'm signed on with an account that has
> server admin rights, however it fails when I signin with an account
> that does not have server admin rights.
> Can you shed some light on solving this?
>
> Alter PROCEDURE OutputOrders
> @.FileName nvarchar(50)
> AS
> set nocount on
> DECLARE @.ReturnCode int
> DECLARE @.ExportCommand varchar(255)
> SET @.ExportCommand =
> 'BCP myServerName.dbo.vOutput out "c:\output\order files\' +
> @.TemplateFileName +
> '" -T -c -S ' + @.@.SERVERNAME
> EXEC @.ReturnCode = master.dbo.xp_cmdshell @.ExportCommand
>
> I have granted execute permission to xp_cmdshell for both users and on
> the view.
>
>
> "John Bell" <jbellnewsposts@.hotmail.com> wrote in message
news:<GO7Jc.2822$Ec4.33496260@.news-text.cableinet.net>...
> > Hi
> > I am not sure why you should want to do this. If the transaction is
rolled
> > back then the file will have different information to the database
table.
> > You may want to look at the stored procedure xp_cmdshell, such as
> > http://tinyurl.com/64azq. Use of the echo command will write the
information
> > into a file select @.sql = echo ' + @.v + ' >
> > \\Myserver\MyShare\myfiledate.txt'
> > Note: The example does not take into account multiple rows in the
inserted
> > table.
> > John
> > "Lauren Quantrell" <laurenquantrell@.hotmail.com> wrote in message
> > news:47e5bd72.0407131443.33a873bb@.posting.google.c om...
> > > Is there a way to create a text file (such as a Windows Notepad file)
> > > by using a trigger on a table? What I want to do is to send a row of
> > > information to a table where the table: tblFileData has only one
> > > column: txtOutput
> > > I want to use the DB front end (MS Access) to send the text string to
> > > the SQL backend, then have the SQL Server create a file to a path,
> > > such as F:/myfiledate.txt that holds the text in txtOutput, then the
> > > trigger deletes the row in tblFileData.
> > > Can this be done easily?
> > > Any help is appreciated

Monday, March 19, 2012

Crashed db recovery

Hello,

I've got MS Windows Server 2003 OS (RAID 10) and MS SQL 2005; however, the server crashed with a "blue screen of death". After a forced reset, SQL Server marked my db as SUSPECT. I tried DBCC CHECKDB with no luck. What can I do to bring it back to life, please? (I have no backup.)

Msg 926, Level 14, State 1, Line 1
Database 'IRIX' cannot be opened. It has been marked SUSPECT by recovery. See the SQL Server errorlog for more information.
Msg 824, Level 24, State 2, Line 1
SQL Server detected a logical consistency-based I/O error: incorrect pageid (expected 1:182; actual 32:7602287). It occurred during a read of page (1:182) in database ID 5 at offset 0x0000000016c000 in file 'D:\MSSQL\IRIX.mdf'. Additional messages in the SQL Server error log or system event log may provide more detail. This is a severe error condition that threatens database integrity and must be corrected immediately. Complete a full database consistency check (DBCC CHECKDB). This error can be caused by many factors; for more information, see SQL Server Books Online.
Msg 3313, Level 21, State 2, Line 1
During redoing of a logged operation in database 'IRIX', an error occurred at log record ID (844:9770:2). Typically, the specific failure is previously logged as an error in the Windows Event Log service. Restore the database from a full backup, or repair the database.
Msg 3414, Level 21, State 1, Line 1
An error occurred during recovery, preventing the database 'IRIX' (database ID 5) from restarting. Diagnose the recovery errors and fix them, or restore from a known good backup. If errors are not corrected or expected, contact Technical Support.

SQL Server detected a logical consistency-based I/O error: incorrect pageid (expected 1:182; actual 32:7602287). It occurred during a read of page (1:182) in database ID 5 at offset 0x0000000016c000 in file 'D:\MSSQL\IRIX.mdf'. Additional messages in the SQL Server error log or system event log may provide more detail. This is a severe error condition that threatens database integrity and must be corrected immediately. Complete a full database consistency check (DBCC CHECKDB). This error can be caused by many factors; for more information, see SQL Server Books Online.
During redoing of a logged operation in database 'IRIX', an error occurred at log record ID (844:9770:2). Typically, the specific failure is previously logged as an error in the Windows Event Log service. Restore the database from a full backup, or repair the database.
An error occurred during recovery, preventing the database 'IRIX' (database ID 5) from restarting. Diagnose the recovery errors and fix them, or restore from a known good backup. If errors are not corrected or expected, contact Technical Support.
Cannot drop the table '#tmp_sp_db_vardecimal_storage_format', because it does not exist or you do not have permission. (Microsoft SQL Server, Error: 824)

You can make use of the sp_resetstatus stored procedure: just execute it, and you need to restart the SQL services, I believe. If that is not working, you can try

bringing the db into emergency mode, and then you can extract the data alone to another db:
alter database DBNAME set emergency

and export the data using SSIS
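Spelled out, that sequence looks roughly like this on SQL Server 2005 (database name as in the post; run it as sysadmin, and note that REPAIR_ALLOW_DATA_LOSS, if you resort to it, does exactly what its name says):

-- clear the suspect flag
EXEC sp_resetstatus 'IRIX'

-- put the database into emergency mode for extraction
ALTER DATABASE IRIX SET EMERGENCY
ALTER DATABASE IRIX SET SINGLE_USER

-- optional last resort: let CHECKDB rebuild what it can, accepting data loss
DBCC CHECKDB ('IRIX', REPAIR_ALLOW_DATA_LOSS)

-- then copy the surviving data out to a fresh database (SSIS, BCP, or INSERT...SELECT)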

|||

Oh dear, it's so easy... I've reset the status, brought it to emergency and exported the data... ALL of it! Amazing, thank you for your kind help.

Sunday, March 11, 2012

CR / VB Sort Problem

OS= Windows XP SP2 Professional, Crystal Report XI Developers Edition, Visual Basic 6 Enterprise

Below is the code I am trying to use to print a report. This code, except for the "ElseIf" statement, is used to print other reports and works fine there. The only difference between printing this report and the other reports is 1) the name of the report and 2) this is the only report where I have a sort field.
.
.
.
ElseIf ReportName = "Alternate.rpt" Then
Set craxreport = craxapp.OpenReport(ReportPath & "\" & ReportName)
Set dbTable = craxreport.Database.Tables(1)
craxreport.RecordSortFields(0) = "+{AlternatePart.dbEnterDate}"
End If
craxreport.DiscardSavedData
' ****************************************************************************************
CrystalActiveXReportViewer1.Refresh
Call CrystalActiveXReportViewer1_RefreshButtonClicked(True)
CrystalActiveXReportViewer1.ReportSource = craxreport
CrystalActiveXReportViewer1.ViewReport
Set craxreport = Nothing
When I run this code, if I keep "craxreport.RecordSortFields(0) " I get a "Subscript out of Range" error.

The report is defined as follows:
- It uses a table called "AlternatePart"
- There are 3 fields in the table - (dbOldPn, dbNewPn, and dbEnterDate)
- All fields are described as TEXT
- The report shows all 3 fields.
- I used the Record Sort Expert to set up one sort field and that is "AlternatePart.dbEnterDate"

Can you tell me what is possibly causing the above error?

Thanks

I have a little experience with crax.
I think maybe the indexes are not zero-based; try index 1.|||Yes, try using craxreport.RecordSortFields(1) and see if it is working|||Yes, try using craxreport.RecordSortFields(1) and see if it is working
That was the problem -- thank you.

Sam

Thursday, March 8, 2012

CPU utilization not even

Hi
I have Windows 2000 Advanced Server and SQL Server 2000 EE. It is used for
a data warehouse. A report which used to finish in 20 minutes is now
taking 4 hours to finish. It is the same query, same indexes, same database.
Looking at the perf counters, one CPU (#4) is at 70% average utilization while
the others are at about 3% average. I also checked that interrupts are at
560/sec. I guess this is something abnormal; can someone help me out?
Thanks,
Subbu.
Hi
In the server settings, check that Maximum Query Plan threshold is at least
5. Looks like your query is only using 1 processor. Also check the SQL
Server is configured to use all processors.
Both are on the Processor tab on the Properties page for the Server.
Regards
Mike Epprecht, Microsoft SQL Server MVP
Zurich, Switzerland
IM: mike@.epprecht.net
MVP Program: http://www.microsoft.com/mvp
Blog: http://www.msmvps.com/epprecht/
"Subbaiahd" <subbaiahd@.hotmail.com> wrote in message
news:#P4yg0A9EHA.2156@.TK2MSFTNGP10.phx.gbl...
> Hi
> I have windows 2000 advanced server and sql server 2000 EE. It is used for
> datawarehouse . A report, which used to finish in 20 minutes now it is
> taking 4 hours to finish. It is same query same indexes same database. I
> look at the perf counters one cpu(4) is at 70% (average)utilization others
> are at 3% (average). I also checked the interupts are 560/sec, I guess
it
> is something abnormal, can someone help me out
> Thanks,
> Subbu.
>
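The Processor tab settings mentioned above roughly correspond to two sp_configure options; a hedged sketch of checking and setting them on SQL Server 2000 (the values shown are only examples):

EXEC sp_configure 'show advanced options', 1
RECONFIGURE

-- 0 = let SQL Server use all available CPUs for a parallel plan
EXEC sp_configure 'max degree of parallelism', 0

-- minimum estimated cost (in seconds) before a parallel plan is considered
EXEC sp_configure 'cost threshold for parallelism', 5

RECONFIGURE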

CPU usage 100% Resources failed

I have a server running Windows 2000 Advanced Server that is
clustered (Active/Active) with SQL Server 2000 with SP3A.
There are periods when the server CPU usage can be at 100%
for 30 minutes or longer.
What are some of the resources that should fail when the CPU
usage is at this rate for an extended period of time?
Thanks,
Mike
Mike, you are losing me here. The system gets pegged at 100% CPU and you want
to know what?
Cheers,
Rod
MVP - Windows Server - Clustering
http://www.nw-america.com - Clustering
"Mike" <anonymous@.discussions.microsoft.com> wrote in message
news:2910d01c464e9$6b435970$a401280a@.phx.gbl...
> I have a server that Windows 2000 Advanced Server that is
> clustered (Active/Active) with SQL Server 2000 with SP3A.
> There are periods when the server CPU usage can be at 100%
> for 30 minutes or longer.
> What some of the resources that should fail when the CPU
> usage is at this rate for an extended period of time?
> Thanks,
> Mike
|||Failover will only happen when the heartbeat fails.
100% CPU doesn't mean that failover will happen.
"Rodney R. Fournier [MVP]" wrote:

> Mike, you are losing me here. The system get pegged at 100% CPU and you want
> to know what?
> Cheers,
> Rod
> MVP - Windows Server - Clustering
> http://www.nw-america.com - Clustering
> "Mike" <anonymous@.discussions.microsoft.com> wrote in message
> news:2910d01c464e9$6b435970$a401280a@.phx.gbl...
>
>
|||Some of the resources that can fail when CPU usage is 100%?
Well, if SQL Server cannot reply to the IsAlive queries, then the SQL Server resource can fail.
What resources fail in your case?
Best Regards,
Uttam Parui
Microsoft Corporation
This posting is provided "AS IS" with no warranties, and confers no rights.
Are you secure? For information about the Strategic Technology Protection Program and to order your FREE Security Tool Kit, please visit http://www.microsoft.com/security.
Microsoft highly recommends that users with Internet access update their Microsoft software to better protect against viruses and security vulnerabilities. The easiest way to do this is to visit the following websites:
http://www.microsoft.com/protect
http://www.microsoft.com/security/guidance/default.mspx

CPU usage (HIGH, MEDIUM, LOW)

I have a Windows 2000 Advanced Server with SQL Server
2000 Enterprise Edition with SP3A.
The server has approximately 300 users, and CPU usage
stays between 90 and 100% for at least 200 minutes out
of a 12-hour processing window.
What is a good metric for LOW, MEDIUM, and HIGH CPU
usage rates?
Thanks,
Mark
Hi,
90-100% for more than 3 hours will be a problem. Identify the process (it might
be a batch) causing the bottleneck.
If you find that not much optimization can be done in your program,
probably go for one additional CPU.
Low: less than 20%
Medium: between 21% and 60%
High: anything above 60%
Thanks
Hari
MCDBA
"Mark" <anonymous@.discussions.microsoft.com> wrote in message
news:222e501c45d80$90292870$a601280a@.phx.gbl...
> I have a Windows 2000 Advanced Server with SQL Server
> 2000 Enterprise Edition with SP3A.
> The server has approximately 300 users and CPU usage
> stays
> states between 90 to 100 % for at least 200 minutes out
> of a 12 hour processing windows in time.
> What is a good metric for LOW, MEDIUIM, and HIGH CPU
> usage rates.
> Thanks,
> Mark
>
>
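One hedged way to identify the sessions burning CPU on SQL Server 2000, as suggested above, is to look at master..sysprocesses (cpu is cumulative CPU time per spid):

SELECT TOP 10 spid, status, loginame, hostname, program_name, cpu, physical_io
FROM master..sysprocesses
ORDER BY cpu DESC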

Wednesday, March 7, 2012

CPU spike with SQL Server 2005 Express Edition

Hello,
Any help is greatly appreciated.
I've recently upgraded my MSDE 2000 to SQL Server 2005 Express Edition
on my Windows 2003 server SP1. I've noticed that running a single one-table
query against the new sql engine causes the sqlservr.exe process
to consume 98% of CPU on the server. The spike usually lasts for
20-30 seconds and I can't figure out why. The query is a simple select
query against one table with only 3000 records.
The server is a one proc machine 2.0 GHz celeron, with 1 GB of RAM.
Again, any suggestions are appreciated.
Thank you.

support@.holylandmarket.com wrote:
> Hello,
> Any help is greatly appreciated.
> I've recently upgraded my MSDE 2000 to SQL Server 2005 Express Edition
> on my Windows 2003 server SP1. I've noticted that running a single one
> table query against the new sql engine causes the sqlservr.exe process
> to consume 98% of CPU on the server. The spike usually lasts for a
> 20-30 seconds and I can't figure out why. The query is a simple select
> query against one table with only 3000 records.
> The server is a one proc machine 2.0 GHz celeron, with 1 GB of RAM.
> Again, any suggestions are appreciated.
> Thank you.
>
What does the execution plan for the query look like? What is the
heaviest part of the plan?
Tracy McKibben
MCDBA
http://www.realsqlguy.com|||It looks like the index scan is the heaviest part of the plan. The
problem is I get high CPU spikes regardless of the query I run.
Again the db is very small with very little data in it. I have the
same db running on Windows XP and I don't see any CPU spikes.
Isam
Thanks.
Tracy McKibben wrote:
> support@.holylandmarket.com wrote:
> What does the execution plan for the query look like? What is the
> heaviest part of the plan?
>
> --
> Tracy McKibben
> MCDBA
> http://www.realsqlguy.com|||Isam wrote:
> It looks like the index scan is the heaviest part of the plan. The
> problem is I get high CPU spikes regardless of the query I run.
> Again the db is very small with very little data in it. I have the
> same db running on Windows XP and I don't see any CPU spikes.
>
You say the spike lasts for 20-30 seconds - how long does the query run?
Do you have auto-close enabled on this database?
Tracy McKibben
MCDBA
http://www.realsqlguy.com|||The query comes back in sub second but the process is still pegged.
Auto Close is set to True on the db.
Isam
Tracy McKibben wrote:
> Isam wrote:
> You say the spike lasts for 20-30 seconds - how long does the query run?
> Do you have auto-close enabled on this database?
>
> --
> Tracy McKibben
> MCDBA
> http://www.realsqlguy.com|||Isam wrote:
> The query comes back in sub second but the process is still pegged.
> Auto Close is set to True on the db.
>
Turn that off and see if the spike goes away...
Tracy McKibben
MCDBA
http://www.realsqlguy.com|||I can still see it spike. Turning auto close off did not change much.
Isam
support@.holylandmarket.com wrote:
> Hello,
> Any help is greatly appreciated.
> I've recently upgraded my MSDE 2000 to SQL Server 2005 Express Edition
> on my Windows 2003 server SP1. I've noticted that running a single one
> table query against the new sql engine causes the sqlservr.exe process
> to consume 98% of CPU on the server. The spike usually lasts for a
> 20-30 seconds and I can't figure out why. The query is a simple select
> query against one table with only 3000 records.
> The server is a one proc machine 2.0 GHz celeron, with 1 GB of RAM.
> Again, any suggestions are appreciated.
> Thank you.
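For completeness, turning off auto-close as suggested above is a one-liner (the database name is a placeholder); on a database upgraded from MSDE it can be worth checking AUTO_SHRINK at the same time:

ALTER DATABASE MyDatabase SET AUTO_CLOSE OFF
ALTER DATABASE MyDatabase SET AUTO_SHRINK OFF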

CPU spike with SQL Server 2005 Express Edition

Hello,

Any help is greatly appreciated.

I've recently upgraded my MSDE 2000 to SQL Server 2005 Express Edition
on my Windows 2003 server SP1. I've noticed that running a single one-table
query against the new sql engine causes the sqlservr.exe process
to consume 98% of CPU on the server. The spike usually lasts for
20-30 seconds and I can't figure out why. The query is a simple select
query against one table with only 3000 records.

The server is a one proc machine 2.0 GHz celeron, with 1 GB of RAM.

Again, any suggestions are appreciated.

Thank you.

Isam wrote:

Quote:

Originally Posted by

Hello,
>
Any help is greatly appreciated.
>
>
I've recently upgraded my MSDE 2000 to SQL Server 2005 Express Edition
on my Windows 2003 server SP1. I've noticted that running a single one
>
table query against the new sql engine causes the sqlservr.exe process
to consume 98% of CPU on the server. The spike usually lasts for a
20-30 seconds and I can't figure out why. The query is a simple select
>
query against one table with only 3000 records.
>
>
The server is a one proc machine 2.0 GHz celeron, with 1 GB of RAM.
>
>
Again, any suggestions are appreciated.
>
>
Thank you.


Oddly, the first rule of computer troubleshooting would seem to apply
here: When in doubt, turn off the Microsoft product. :P

Seriously, does this happen even after restarting SQL and/or the
server?

That's about it for me. I don't know enough to really troubleshoot
this, but I can point out the obvious with the best of them. ;)

--Richard|||Isam (support@.holylandmarket.com) writes:

Quote:

Originally Posted by

I've recently upgraded my MSDE 2000 to SQL Server 2005 Express Edition
on my Windows 2003 server SP1. I've noticted that running a single one
table query against the new sql engine causes the sqlservr.exe process
to consume 98% of CPU on the server. The spike usually lasts for a
20-30 seconds and I can't figure out why. The query is a simple select
query against one table with only 3000 records.


Does the query run for 20-30 seconds, or does it complete long before
the CPU spike is over?

Care to post the query and the definition for the underlying table?

By the way, did you run UPDATE STATISTICS on all tables after the upgrade?

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx
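A hedged sketch of the post-upgrade step asked about above (run in the upgraded database; sp_updatestats touches every table, while the explicit form shown for one made-up table forces a full scan):

-- refresh statistics on all tables in the current database
EXEC sp_updatestats

-- or, per table, with a full scan (table name is invented)
UPDATE STATISTICS dbo.Orders WITH FULLSCAN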

CPU related

I have 4+ CPUs. Do I have to enable "Use Windows NT fibers"?
from
Doller

Required to, no. May benefit from, yes. On a dedicated SQL server you
should boost priority and enable fibers after testing. If this server is not
dedicated to SQL Server then you will run the risk of throttling the other
services on the server.
burt_king@.yahoo.com
"doller" wrote:
> I have +4 CPU do i have to enable Use windows NT Fibers.
> from
> Doller
>|||burt_king wrote:
> Required to, no. May benefit from, yes. On a dedicated SQL server
> you should boot priority and enable fibers after testing. If this
> server is not dedicated to SQL Server then you will run the risk of
> throttling the other services on the server.
>
>> I have +4 CPU do i have to enable Use windows NT Fibers.
>> from
>> Doller
The general recommendation with "boost priority" is not to use it ever.
Even on dedicated servers, I have not read any reports that it actually
helps performance, and I have seen many items in the past that talk about
problems. My understanding is that "NT Fibers" might show a small
performance boost if your server is experiencing a lot of context
switching.
My general recommendation is to leave those two options alone and look
for other places to boost performance like your SQL, disk subsystems,
and physical database layout including your backup location.
David Gugick
Quest Software
www.imceda.com
www.quest.com|||Dear David,
As we faced a lot of performance problems last year, we enabled
the fibre switch and it helped us. But the problem raised is that we
cannot create jobs and can't even execute them.
I searched a lot about this bug and found a topic on the Microsoft web
site. They say to disable the fibre option. After disabling the fibre
option and restarting the services, our jobs work fine,
but then we can't use the Windows NT fibers option.
Are Windows NT fibers related to jobs?
from
Doller
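For reference, both options discussed in this thread are exposed through sp_configure, and a restart of the SQL Server service is needed for them to take effect; a hedged sketch:

EXEC sp_configure 'show advanced options', 1
RECONFIGURE

-- the "Use Windows NT fibers" checkbox corresponds to lightweight pooling (1 = on, 0 = off)
EXEC sp_configure 'lightweight pooling', 0

-- the "Boost SQL Server priority" checkbox; generally best left at 0
EXEC sp_configure 'priority boost', 0

RECONFIGURE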


Sunday, February 19, 2012

Counting rows ?

Hi,
Using SQL Server 2000 on Windows 2000 Server, I'm working on a 2-million-row
database, in one table. The data are documents, and the columns of my table are
"authors", "organisation", "title", "abstract" (these four are full-text
indexed), plus other columns with reference numbers.
A web application provides a full-text search on these four columns (author,
affiliation, title and abstract). It provides the result in two steps:
- First: give the number of results
- Second: give a link to the documents extracted from the database
(Because if there are too many documents, the user can change his query to
get fewer.)
So my problem is that the first step is too long: 30 to 60 seconds. I use this
query:
Code:
select count(*) from import
where (CONTAINS ( Title,'"tube" AND "heat"')
OR CONTAINS ( Abstract,'"tube" AND "heat"'))
AND (CONTAINS ( Organisation,'"CIA" OR "FNSEA"'))
and cast(PY as integer)>=2004
and cast(PY as integer)<=2006

Is there a better solution ?
Thanks,
Jean-Michel
There is no real way to optimize this search. You might get some improvement
by rearranging this clause
and cast(PY as integer)>=2004
and cast(PY as integer)<=2006
I am not sure what the data type for PY is, but the cast operation will be
expensive for you as you are doing it against each row returned from the
contains results sets.
Hilary Cotter
Looking for a SQL Server replication book?
http://www.nwsu.com/0974973602.html
Looking for a FAQ on Indexing Services/SQL FTS
http://www.indexserverfaq.com
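A hedged rewrite along those lines, assuming PY is a character column holding a four-digit year, so that string comparison orders the same way as the numeric comparison (if PY is already an int, simply drop the casts instead):

select count(*)
from import
where (CONTAINS(Title, '"tube" AND "heat"')
    OR CONTAINS(Abstract, '"tube" AND "heat"'))
  and CONTAINS(Organisation, '"CIA" OR "FNSEA"')
  and PY >= '2004'
  and PY <= '2006'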
"Famille Careil" <careil04@.tele2.fr> wrote in message
news:Ob6X1A0aFHA.3280@.TK2MSFTNGP09.phx.gbl...
> Hi,
> Using SQL Server 2000 on Windows 2000 Server, i'm working on a 2 milion
rows
> database, in one table. The data are document and the column of my table
are
> "authors", "organisation", "title","abstract" (these four ones are
fulltext
> indexed) and other column with references number.
> A web application provide a fulltext search on these four column (author,
> affiliation, title and abstract). It provide the result in two step :
> - First one : give the number of result
> - Second one : give a link to the documents extracted from the database
> (Because if there is too much document, the user can change his query to
> have less.)
> So my problem is the first step is too long : 30 to 60 seconds. I use this
> query :
>
> Code:
> select count(*) from import
> where (CONTAINS ( Title,'"tube" AND "heat"')
> OR CONTAINS ( Abstract,'"tube" AND "heat"'))
> AND (CONTAINS ( Organisation,'"CIA" OR "FNSEA"'))
> and cast(PY as integer)>=2004
> and cast(PY as integer)<=2006Is there a better solution ?
>
> Thanks,
> Jean-Michel
>