Showing posts with label run. Show all posts

Thursday, March 29, 2012

Create and access database through ftp

I have to create a website and a database, but I only have access to the server by FTP.

Is there a possibility to create and run some kind of SQL script automatically if I only have access via FTP?

(Using SQL Server 2000)

TIA, John

No, to the best of my knowledge you cannot run a SQL script solely through FTP. Surely your ISP must give you some sort of interface for creating tables?

Create a view in VBA

Hi,

Is it possible to create some SQL in VBA, 'run' it, then view the results
as a datasheet?

I'm trying to throw together a fairly large search form, and VBA seems the
best way to parse the parameters.

Cheers,
Chris

"Not Me" <Not.Me@.faker.fake.fa.kee> wrote in message news:c6rag7$kt$1@.ucsnew1.ncl.ac.uk...
[...]

Yes - you could use the ADO COM objects, then return results to your client
and format/present them there. If you need a more precise answer, then it
would help to have some more detail about what you're trying to do and what
tools you're using (eg. Access, Excel etc.).

Simon|||"Simon Hayes" <sql@.hayes.ch> wrote in message news:40913ac0$1_1@.news.bluewin.ch...
[...]

Thanks for the reply.
In the past, I have been able to pre-write a query in Access, (i.e. have it
saved in the queries list) and use docmd.openquery in VBA to have it
displayed on screen.
I have also been able to create queries 'on the fly' in VBA, and be able to
iterate through them/work on them in the same program.
What I'd like to do is sort of combine the above, so create a query in VBA
and have it displayed in a datasheet view. So the user in an ADE would
select from a few comboboxes, press go and up would pop the resulting query.

Cheers,
Chris
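For an MDB, one common pattern matching what Chris describes is to overwrite the SQL of a saved query and then open it in datasheet view. A rough sketch only - the query name qrySearch, the table, and the control names are all hypothetical, and in an ADP/ADE the saved object would be a view or stored procedure rather than a DAO QueryDef:

```vba
' Sketch only: build SQL from form controls, store it in a saved query
' named "qrySearch" (assumed to already exist), then show the datasheet.
Private Sub cmdGo_Click()
    Dim strSQL As String
    strSQL = "SELECT * FROM Orders " & _
             "WHERE CustomerID = '" & Me.cboCustomer & "'"
    CurrentDb.QueryDefs("qrySearch").SQL = strSQL
    DoCmd.OpenQuery "qrySearch", acViewNormal
End Sub
```

DoCmd.OpenQuery pops the query up in the normal datasheet view, which is the "press go and up pops the result" behaviour described above.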

Tuesday, March 27, 2012

Create a Trigger to run a Stored Procedure

I want to create a Trigger that will run once a day at a set time.

This trigger would actually run a Stored Procedure named say "spAppData".

This SP will in turn run and delete the entire tables data, and re-insert data into a table we'll call APPS.

Would anyone know how to help me get started on creating a Trigger with the info I've provided?

Thank you,

NBK

To schedule an action, you want a SQL Agent Job to execute your stored procedure at the determined time.

If you are using SQL Express (which does NOT include SQL Agent), you can use a combination of a Windows Scheduler task, SQLCmd.exe, and a Stored Procedure.
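For the SQL Agent route, a job along these lines could be created in T-SQL. This is a sketch only - the job name, database name, and schedule are illustrative; spAppData is the procedure named above:

```sql
-- Sketch only: schedules dbo.spAppData to run daily at 03:00 via SQL Agent.
USE msdb
EXEC sp_add_job @job_name = N'Refresh APPS'
EXEC sp_add_jobstep @job_name = N'Refresh APPS',
    @step_name = N'Run spAppData',
    @subsystem = N'TSQL',
    @database_name = N'MyDatabase',       -- illustrative
    @command = N'EXEC dbo.spAppData'
EXEC sp_add_jobschedule @job_name = N'Refresh APPS',
    @name = N'Daily 3am',
    @freq_type = 4,                       -- daily
    @freq_interval = 1,
    @active_start_time = 30000            -- HHMMSS, i.e. 03:00:00
EXEC sp_add_jobserver @job_name = N'Refresh APPS'
```

On SQL Express, the equivalent Windows Scheduler task would run something like `sqlcmd -S .\SQLEXPRESS -E -d MyDatabase -Q "EXEC dbo.spAppData"` (instance and database names are assumptions).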

|||

Thank you Arnie. That's just the direction I needed.

Wallace

|||Can you please give more details on this?
I have the same requirement.
Thanks in advance

Thursday, March 22, 2012

create a local (portable) app plus data

We have an Access app that uses tables from an SQL server. Is it possible to
copy the SQL tables locally into the Access mdb file and run the app locally?
What about table relationships? How can I tell if they might be an issue?
Some relationships are built in the diagram tool on the server, and some are
coded into the app itself.
Thanks a heap - Randy
Randy,
Why don't you use MSDE (http://www.microsoft.com/sql/msde/default.mspx )
instead of Access? MSDE is SQL Server, Desktop Edition, and it is free. This
way you could have exactly the same structure in local databases as you have
on SQL Server.
Dejan Sarka, SQL Server MVP
Mentor
www.SolidQualityLearning.com
"RandyNesst" <RandyNesst@.discussions.microsoft.com> wrote in message news:1382ED81-4064-4B05-956C-7DD5EF2D08FA@.microsoft.com...
[...]
|||That sounds like a really good idea! I did a little looking around at the
link you gave me, but I have maybe a simple question - what is the process
like to create the MSDE localized database? Just an overview - is it about a
1 hour thing or about a 1 or more day thing? Is there a tool to do it?
Thanks again,
Randy
"Dejan Sarka" wrote:
[...]
|||Randy,
I am not quite sure whether I understand the question. If you mean the
process of creating a local copy of the server database, it is quite simple:
you can script all objects from the server db in Enterprise Manager and then
execute the script in Query Analyzer on the MSDE, and you have a local copy
with the same structure, but without data. If you need a copy of the data as
well, you can use Data Transformation Services, Merge Replication, ...
Dejan Sarka, SQL Server MVP
Mentor
www.SolidQualityLearning.com
"RandyNesst" <RandyNesst@.discussions.microsoft.com> wrote in message news:62706E3A-F34B-4ABF-AA75-11600E920014@.microsoft.com...
[...]
|||At this point, you can download SQL Express 2005 (also free) and distribute
that with your app.
William Stacey [MVP]
"RandyNesst" <RandyNesst@.discussions.microsoft.com> wrote in message news:62706E3A-F34B-4ABF-AA75-11600E920014@.microsoft.com...
[...]

Tuesday, March 20, 2012

create a .txt file from a table using sql script

Hi all,
I'm new to SQL. I need to create a SQL script to be run in Query Analyzer.
This script must export a table to a .txt file in a server folder
(d:\backup\tablename.txt). I know I can do it using the DTS Export Wizard, but
I need to do it with many tables (about 30) and many times (every week).
Your time is really appreciated.
Thanks
Ped_esc
Seems you are looking at generating DDL and/or data for that DDL. I have listed some options here:
http://www.karaszi.com/SQLServer/info_generate_script.asp
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
"Ped_Esc" <PedEsc@.discussions.microsoft.com> wrote in message news:FB03A791-F776-4B57-B163-82F392CB0AF4@.microsoft.com...
[...]
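One period-appropriate way to script this instead of the DTS wizard is bcp driven from T-SQL. This is a sketch only - the server and database names are illustrative, and xp_cmdshell must be available to the caller:

```sql
-- Sketch only: exports every user table to d:\backup\<table>.txt
-- using bcp launched through xp_cmdshell.
DECLARE @table sysname, @cmd varchar(500)
DECLARE tbl CURSOR FOR
    SELECT name FROM sysobjects WHERE type = 'U'   -- user tables
OPEN tbl
FETCH NEXT FROM tbl INTO @table
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @cmd = 'bcp MyDatabase.dbo.' + @table
             + ' out d:\backup\' + @table + '.txt -c -T -S MyServer'
    EXEC master..xp_cmdshell @cmd
    FETCH NEXT FROM tbl INTO @table
END
CLOSE tbl
DEALLOCATE tbl
```

Saved as a stored procedure, this could then be run on a weekly schedule rather than re-running the wizard 30 times.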

Monday, March 19, 2012

Crash with "Failed Assertion"

Hello,
I have a table with 2 indexes.
My C++ program deletes some data from this table and then
does bcp data in. It does it twice in one run.
After I run my program for the first time I get the
correct result. When I run it again, the first bcp is
successful, however there is no data from the second bcp.
If I repeat it again, the result is correct; one more
time - the data from the first bcp only.
If I try to debug it, everything works fine.
If I run it without break points in the program, I
duplicate a problem.
SQL Server log records an error:
Failed Assertion = '(logMode != nonlogged) || (dbt->dbt_dbid == TEMPDBID)'
and creates a SQL dump file, which I don't know how to
interpret.
If I remove the indexes from the table, everything works fine.
Does anyone understand what's going on?
I would appreciate any help.
Thanks.|||Are your indexes unique indexes? Chances are that's why it's failing the
second time, if you're using the same data in your BCP load.
-Morgan
"Tanya Bardakh" <tbardakh@.dysanalytics.com> wrote in message news:0cf001c39d5d$a3068a20$a401280a@.phx.gbl...
[...]
|||Thank you for the reply.
My indexes are not unique, nor clustered.
Tanya.
|||Any replication on the tables involved?
How about triggers?
If the answer to the above is "no", the only thing I can think of is that
you're running out of space in the logfile and you have configured for
"Simple" recovery model. It would work in "debug" because SQLServer would
have time to remove the previous checkpoint...
Bruce
"Tanya Bardakh" <anonymous@.discussions.microsoft.com> wrote in message news:0de201c39d64$5d354b10$a401280a@.phx.gbl...
[...]
|||Thank you, Bruce
I don't have any triggers or replication.
Space should not be an issue, since I have plenty of it
(30 GB). Actually I am reloading the same data (about 200
rows) over and over again. The only thing is, it's
successful the first time and crashes SQL Server on the
subsequent run.
|||You should contact Product Support (http://support.microsoft.com), who will
be able to help you figure this out.
Regards,
Paul.
--
Paul Randal
DBCC Technical Lead, Microsoft SQL Server Storage Engine
This posting is provided "AS IS" with no warranties, and confers no rights.
"bcarson@.asgoth.com" <anonymous@.discussions.microsoft.com> wrote in message news:027e01c39d82$9d9c2070$a301280a@.phx.gbl...
[...]

Sunday, March 11, 2012

CR10 and Lotus Notes

We have a Win2K RAS server (CR10) with IIS. However, our applications run on Lotus Notes, which is another server. Previously, when we were using CR8 we generated a URL line and passed it on to the pageserver. Now, in version 10, we are supposed to write ASP code. My question is twofold:
1-where can I get sample code for calling a report in Lotusscript?
2-what module do I need to install on Lotus Notes server to support the code?

Thank you very much.|||Try searching on Crystal's Website:

http://support.businessobjects.com/search/advsearch.asp

CR Viewer Logon Failure

Hi Everybody,
I have developed a report in CR 9 and it is working perfectly well when run from CR. My problem is as follows:
I have developed a VB application containing a Crystal Report viewer. Here, on click of a button, I'm loading the CR by passing parameters to it.
But when I try to run the application it gives me an error:
Logon failed - Details: 28000 [ODBC SQL Server Driver][SQL Server] Login failed for user "ab".
I have no code for database connection in VB, just the OpenReport and ViewReport commands.

Why am I getting such an error in VB when I'm able to run the CR separately successfully with the same username/password?

Please help, it's really urgent.|||Do you have a DSN pointing to your database?|||Yes, I have a system DSN configured with SQL authentication to my required database server, and I'm able to connect to it successfully. This is the DSN I used to create the Crystal Report.
Any reason why I'm not able to run the VB application?|||How are you referencing your DSN in your code? Can you paste that function?

CR problem: error when open a project in VB

Hi,

When I open a VB-Access project (using CR in it) in VB, I get an error message "can't load crystl32.ocx", so the project cannot run.

This project was written by my friend, and her computer already has CR installed, but my computer doesn't. Although I've copied a crystl32.ocx to that folder, I still get that error.

So can I load this project without installing CR?|||Is crystl32.ocx registered properly? It may also need dependency files that are on your friend's system but not yours. Maybe your friend can create a setup file that includes all the files that are needed. You can run the setup file, then you should have no problem running the project in VB.

Or, if you don't need to use Crystal for what you need the project for, you can have your friend give you a copy of the project without the ocx component checked. Doing it this way would allow you to have access to the code, but you may not be able to run the program and you probably won't be able to compile it.

Thursday, March 8, 2012

CPU Usage Management and SQL

Right now SQL Server accounts for 80% of my CPU Usage
which is being maxed out and causing my sites to run
slow. I fear this may be related to me not properly
closing connections (I am lazy) and me just hosting too
many databases on one server. Regardless, I need insight
on how to manage this and reset CPU Usage from a SQL
perspective. Also, is there a way to see all connections
opened and individually close them? Will that help me
you think? Thanks in advance.|||If you execute
sp_who
or
sp_who2
from Query Analyzer, you will be shown a listing of the open connections.
Generally, open connections would not cause CPU usage, but they do use memory. If there are lots of open (unused) connections then your memory usage might be higher than it needs to be. ADO should be using connection pooling, so I would not expect that you have lots of extra connections, but I could be wrong.
The best tool to see what is going on within the database server is Profiler (within the SQL Server program group). It will show you what SQL commands are being sent to the database.
If you can identify which stored procedures/select statements cause problems (take a long time to run, use gobs of CPU) you might be able to re-write them or add indexes so that the system does not have to work as hard to retrieve the information that the application is asking for.
-- Keith, SQL Server MVP
"ASP Dev" <dmicheli@.cmiti.com> wrote in message news:030201c34b03$bead2830$a301280a@.phx.gbl...
[...]
|||Thank You. I appreciate your help.
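For the "see all connections and individually close them" part of the question, a minimal sketch (the spid value is illustrative - use whatever number sp_who2 reports for the connection you want to drop):

```sql
-- List current connections: spid, login, host, CPUTime, DiskIO, etc.
EXEC sp_who2

-- Forcibly close one connection by its spid (example value only).
KILL 52
```

Killing connections is a blunt instrument, though; fixing the code so it closes its own connections is the real cure.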
|||We had a history of CPU usage here. At peak time the usage would stay up to
100%, staying there for 10, 15 minutes. Last few weeks it stayed above 95%
for a couple of hours. The problem was poor SQL statements and the management
reports. All queries were pure SELECT or UPDATE; no stored procedures were used.
During business hours the big bosses just wanted to see how well the business was doing
so far for the day, so they just clicked whenever they wanted to produce an
up-to-date online report!!! I've warned the bosses and developers about
this, but they preferred to upgrade h/w rather than rewrite the
code, and kept running reports whenever they wanted. After a meeting with
the big bosses they agreed it was a big problem and promised not to run
reports during peak hours. And just the next day we saw them running reports like
crazy again! Until last month the web site got timed out; then they took it
seriously. Spending 50-100K on both s/w and h/w wasn't an easy thing, so
the developers rewrote the main interface for the site using pure stored
procedures. They also locked out the report tool, keeping it from running between
10am and 2pm. Result: it rocked. Using Insight Manager I see the avg. CPU
usage is between 10 and 15%. Everyone is happy with it. :-)
"ASP Dev" <dmicheli@.cmiti.com> wrote in message news:030201c34b03$bead2830$a301280a@.phx.gbl...
[...]
|||On Tue, 15 Jul 2003 14:19:06 -0700, "Flicker"
<hthan@.superioraccess.com> wrote:
>They also locked out the report tool, kept it from runinng b/w
>10am and 2pm. Result: it rocked.
How about you replicate the data out to a mart/warehouse so they can
click all they want?
J.|||There is no need for the report to run every few minutes. The chairman does
it; the VPs do it; the CEO does it. This is more a habit than a
business need, just like checking stocks every hour. After we
locked them out, they do other things to kill time. :) Hey, they are also
the OWNERS. So long ...
"JXStern" <JXSternChangeX2R@.gte.net> wrote in message news:6fuahv4s12omn7l7k3c1jfp3ivs2up3h8h@.4ax.com...
[...]

Wednesday, March 7, 2012

CPU Time? what is it sec, milli-sec?

when you run sp_who2, what do the numbers under CPUTime represent?
Seconds, milliseconds, or something else?|||The CPU time is taken from the sysprocesses table, and the cpu column is defined as:

Cumulative CPU time for the process. The entry is updated for all processes, regardless of whether the SET STATISTICS TIME ON option is ON or OFF.

Not that this answers your question, but that's all I know ...|||nano second ie 1/1,00,000,0000|||A Microsoft KB article (Q170360) that I found confirms that CPU time from sysprocesses (which is where sp_who2 gets its data) is in milliseconds.|||nano second ie 1/1,00,000,0000

lol That would be interesting.
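Since sp_who2 reads from sysprocesses, the raw millisecond figures can also be queried directly; a quick sketch:

```sql
-- cpu is cumulative CPU time in milliseconds for each connection.
SELECT spid, cpu, physical_io, status
FROM master..sysprocesses
WHERE spid > 50    -- spids 50 and below are reserved for system processes
```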

Saturday, February 25, 2012

CPU and MEMORY usage

Can someone tell me where I can get the CPU and memory usage of my server from the SQL tables, or a script or stored proc that I could run to get the above. I need that, and to store its results in history, to show how our servers are performing|||Execute sp_who2 active

this will tell you who is on your db and how much cpu (CPUTime) and disk I/O (DiskIO) each user is taking

Cheers|||Actually that gives the user activity across the server you run it on, not just the db activity - sorry|||You should use Performance Monitor to assess your server activity over a period of time.

you can log this info in .csv then import into SQL Server

have a look at :

www.sql-server-performance.com/performance_monitor_tips.asp

cheers|||In sp_who2, does the CPUTime equal the CPU usage in Task Manager? And what about the memory? I also want this memory figure to be the same as the one in Task Manager.|||Not sure about that. I think the cputime and diskIO that sp_who shows for each user and system query reflect what is occurring at the time of the SP execution. This does not take into consideration the O/S resources getting used.

I guess you could find out how much mem is allocated to SQL by looking at the properties of the server (right-click on the server in Enterprise Manager). From this figure you can work out how much of the resources are getting used in terms of how much mem has been allocated to SQL Server.

Cheers|||I 100% agree with you when you say we can manually look at the properties, and again at the Task Manager properties; the thing is, we want a script that will do this for us automatically every hour to check the memory and CPU usage.|||Use Performance Monitor as it is the easiest way to schedule.

Look over the link above to see what counters you want to look at.

e.g. Memory -- Pages/Sec, Memory Manager -- Total Server Memory (KB),
etc.

What you could do is schedule it to output a csv file on the hour, then set up a DTS task to import this into SQL Server for analysis.

Once you know how much mem you allocated to SQL Server then you have a constant that you can do calculations with.

Cheers|||Thanks, I think this will work perfectly
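The CSV-import step could look something like this once Performance Monitor is logging hourly. A sketch only - the table, column, and file names are illustrative, and PerfMon writes counter values as text, so convert after loading if needed:

```sql
-- Sketch only: load an hourly Performance Monitor CSV log into a table
-- for later analysis.
CREATE TABLE PerfLog (
    SampleTime          varchar(30),
    PagesPerSec         varchar(20),
    TotalServerMemoryKB varchar(20)
)

BULK INSERT PerfLog
FROM 'd:\perflogs\sqlcounters.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)
```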

CPU and Duration don't match in SQL Server Profiler

Hello All,

I have one store proc that is really slowing down my SQL server, at least according to the SQL Profiler. When I run the profiler and choose to show store procs that take longer then 3000 ms to execute, this is the only precedure that shows up. The CPU time and Duration is both in the 3000 ms mark. When I open up Query Analyzer and run that same store proc it takes less then a second, but when I check the Profiler to see the time, it's Blank (yellow) .. I'm using SQL 2005 Profiler. Any ideas why

1. I don't see a time when I execute the query from the profiler?

2. It says it takes 3 seconds when I only see it taking less than one?

Thank you

This is probably 3000 microseconds (3 milliseconds) :)

"In SQL Server 2005, the server reports the duration of an event in microseconds (one millionth, or 10-6, of a second) and the amount of CPU time used by the event in milliseconds (one thousandth, or 10-3, of a second). In SQL Server 2000, the server reported both duration and CPU time in milliseconds. In SQL Server 2005, the SQL Server Profiler graphical user interface displays the Duration column in milliseconds by default, but when a trace is saved to either a file or a database table, the Duration column value is written in microseconds."

Books Online:
http://msdn2.microsoft.com/en-us/library/ms175848.aspx

SQL Server Enthusiast:
http://weblogs.sqlteam.com/tarad/archive/2006/10/04/13621.aspx

Paul A. Mestemaker II
Program Manager
Microsoft SQL Server Manageability
http://blogs.msdn.com/sqlrem/
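
So when a 2005 trace has been saved to a file or table, the Duration column needs dividing by 1000 to get back to the milliseconds Profiler displays. A sketch of checking a saved trace file this way (the file path is a placeholder):

```sql
-- Duration in a saved SQL 2005 trace is in microseconds;
-- CPU is already in milliseconds
SELECT TextData,
       Duration / 1000 AS DurationMs,
       CPU
FROM ::fn_trace_gettable('C:\traces\mytrace.trc', DEFAULT)
WHERE Duration / 1000 > 3000
ORDER BY Duration DESC
```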

CPU

I would highly appreciate it if anyone could guide me regarding CPU %. I checked
Task Manager and the CPU % goes to 100, and when I ran Profiler to find out
which query was taking the longest time, I didn't find any query; when
I looked at Profiler, all the SPs were taking 0% CPU usage... what would be the
reason the CPU goes to 100%, any idea?
Thanks
Profiler can add quite a load to an already busy server. You should create
a server side trace that traces directly to a file on a local drive to
minimize impact on the server. You can then import that trace into a table
or profiler for viewing later. Check out sp_trace_create in Books Online.
Alternately you can set up profiler how you want and then click on File -
Script Trace as 2000 to have it generate most of the code for you.
Andrew J. Kelly SQL MVP
"Rogers" <naissani@.hotmail.com> wrote in message
news:eyLJuhtPGHA.140@.TK2MSFTNGP12.phx.gbl...
>I would highly apprecite if any one guide me regarding CPU %, I check in
>the task manager the CPU % goes to 100 and when I run the Profiler to
>findout the query which query taking longer time then I didn'd find any
>query, when I saw the Profiler all the SPs taking 0% CPU usage... what
>would be reason, why CPU goes to 100%, any idea ?
> Thanks
>
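
The scripted output Andrew describes boils down to a handful of system procedure calls. A minimal sketch, with the event and column IDs taken from Books Online (10 = RPC:Completed, 12 = SQL:BatchCompleted, 1 = TextData, 13 = Duration, 18 = CPU); the file path and rollover size are placeholders:

```sql
DECLARE @TraceID int, @maxsize bigint, @on bit
SET @maxsize = 50   -- MB per rollover file
SET @on = 1

-- Option 2 = TRACE_FILE_ROLLOVER; the .trc extension is added automatically
EXEC sp_trace_create @TraceID OUTPUT, 2, N'C:\traces\mytrace', @maxsize

-- Capture TextData (1), Duration (13) and CPU (18) for
-- RPC:Completed (10) and SQL:BatchCompleted (12)
EXEC sp_trace_setevent @TraceID, 10, 1,  @on
EXEC sp_trace_setevent @TraceID, 10, 13, @on
EXEC sp_trace_setevent @TraceID, 10, 18, @on
EXEC sp_trace_setevent @TraceID, 12, 1,  @on
EXEC sp_trace_setevent @TraceID, 12, 13, @on
EXEC sp_trace_setevent @TraceID, 12, 18, @on

EXEC sp_trace_setstatus @TraceID, 1   -- 1 = start the trace
```

Tracing to a local file like this keeps the overhead far lower than watching the same events live in the Profiler GUI.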
|||Just want to add that the longest-running query may not necessarily be the
query or queries that consume the most CPU, or be primarily responsible for
driving your processors to 100%.
Linchi
"Rogers" wrote:

> I would highly apprecite if any one guide me regarding CPU %, I check in the
> task manager the CPU % goes to 100 and when I run the Profiler to findout
> the query which query taking longer time then I didn'd find any query, when
> I saw the Profiler all the SPs taking 0% CPU usage... what would be reason,
> why CPU goes to 100%, any idea ?
> Thanks
>
>
|||Rogers wrote:
> I would highly apprecite if any one guide me regarding CPU %, I check
> in the task manager the CPU % goes to 100 and when I run the Profiler
> to findout the query which query taking longer time then I didn'd
> find any query, when I saw the Profiler all the SPs taking 0% CPU
> usage... what would be reason, why CPU goes to 100%, any idea ?
> Thanks
Are you sure it's the SQL Server service that's driving the CPU up? If
so, you can create a trace (server-side preferred) and look for
SQL:BatchCompleted / RPC:Completed, and add a CPU filter of, say, 500 to
start in order to see the high-CPU users. Adjust the filter as needed.
As Linchi stated, long duration does not translate to high CPU. You
could have a highly parallel query run for a short time and consume a
lot of CPU. You could also have a long-running transaction that consumes
little CPU. You need to look at both metrics, as they are both important
in their own way.
David Gugick - SQL Server MVP
Quest Software
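
The CPU filter David mentions maps to an sp_trace_setfilter call against an existing server-side trace. In this sketch the trace ID of 1 is a placeholder for whatever sp_trace_create returned; column 18 = CPU and comparison operator 4 = "greater than or equal" per Books Online:

```sql
DECLARE @cpu int
SET @cpu = 500   -- starting threshold; adjust as needed

-- @traceid, @columnid (18 = CPU), @logical_operator (0 = AND),
-- @comparison_operator (4 = >=), @value
EXEC sp_trace_setfilter 1, 18, 0, 4, @cpu
```

Note the @value parameter must match the filtered column's data type, hence the int variable rather than a literal of another type.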
|||Thanks for your reply. How can I check for the long-running query? Is there
any way I can check..
Thanks
"David Gugick" <david.gugick-nospam@.quest.com> wrote in message
news:ufXajvVQGHA.5248@.TK2MSFTNGP09.phx.gbl...
> Rogers wrote:
> Are you sure it's the SQL Server service that's driving the CPU up? If so,
> you can create a trace 9server-side preferred) and look for
> SQL:BatchCompleted / RPC:Completed, and add a CPU filter of say, 500 to
> start in order to see the high CPU users. Adjust the filter as needed. As
> Linchi stated, long duration does not translate to high cpu. You could
> have a highly parallel query run for a short time and consume a lot of
> CPU. You could also have a long running transaction that consume little
> CPU. You need to look at both metrics as they are both important intheir
> own way.
> --
> David Gugick - SQL Server MVP
> Quest Software
>
|||Rogers wrote:
> Thanks for your reply, how I can check the long running query? is
> there any way I can check..
>
With duration from the trace using the SQL:BatchCompleted or RPC:Completed
events or SP:StmtCompleted or SQL:StmtCompleted at a lower level.
David Gugick - SQL Server MVP
Quest Software
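
Pulled together, checking a saved server-side trace for the long-running batches might look like this (the file path is a placeholder, and remember Duration is in microseconds in a saved 2005 trace but milliseconds in 2000):

```sql
SELECT TOP 20
       TextData,
       Duration,     -- microseconds in a saved SQL 2005 trace
       CPU,          -- milliseconds
       StartTime
FROM ::fn_trace_gettable('C:\traces\mytrace.trc', DEFAULT)
WHERE EventClass IN (10, 12)   -- RPC:Completed, SQL:BatchCompleted
ORDER BY Duration DESC
```

Sorting the same result by CPU instead of Duration is the quick way to see Linchi's point that the two lists often differ.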

Friday, February 24, 2012

Couple of SQL Agent questions

Hi all,
I've got a customer that is creating a LOCAL PACKAGE/SQL AGENT Job to run a
TextImport.exe program on the server, which loads a table and creates a log
file.
They want to be able to print the log file - to a network printer, as the
final step. I do not see PRINT type tasks in the design window. Any
suggestions?
Also, what they have so far runs when they EXECUTE the package from LOCAL
PACKAGES. But when they try to START JOB from SQL AGENT it fails and they
get a message for step 1 saying:
"The step is improperly defined (it needs a valid owner and/or command) and
so could not be run. The step failed."
Thanks.Here is a suggestion.
Use the command task executing print.exe /d:lpt1 c:\filename.log
Most of the time when executing packages through Agent this is a security
issue.
Tushar
"Steve Z" <szlamany@.antarescomputing_no_spam.com> wrote in message
news:eNuHobsLEHA.2440@.TK2MSFTNGP10.phx.gbl...
> Hi all,
> I've got a customer that is creating a LOCAL PACKAGE/SQL AGENT Job to run
> a
> TextImport.exe program on the server, which loads a table and creates a
> log
> file.
> They want to be able to print the log file - to a network printer, as the
> final step. I do not see PRINT type tasks in the design window. Any
> suggestions?
> Also, what they have so far runs when they EXECUTE the package from LOCAL
> PACKAGES. But when they try to START JOB from SQL AGENT it fails and they
> get a message for step 1 saying:
> "The step is improperly defined (it needs a valid owner and/or command)
> and
> so could not be run. The step failed."
> Thanks.
>
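
A final print step like the one Tushar suggests could also be attached directly to the Agent job as a CmdExec step, bypassing the DTS package. A sketch; the job name, printer share and log path are all placeholders:

```sql
-- Assumes a job named 'TextImport' already exists
EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'TextImport',
    @step_name = N'Print import log',
    @subsystem = N'CMDEXEC',
    @command   = N'print /d:\\printserver\netprinter C:\logs\textimport.log'
```

As for the "improperly defined" error, check which account owns the job: CmdExec steps run under the Agent service account (or its proxy) rather than the interactive user who ran the package from Local Packages, which is the usual cause of a package working interactively but failing under Agent.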

Couple of questions

1) To use or not to use the 3GB switch. We have 8GB of RAM and just run SQL
Server with awe and PAE Enabled. I have heard that 3GB is good and I have
also heard 3 GB is bad with the above setup..
2) We have a 32 bit system and we run into worker thread issues such that all 256
worker threads are used and all incoming connections are rejected. Should I
be increasing those worker threads to say 512 ?
3) What is stored in the address space above the 4GB limit ..i.e when i use
awe to address say 6GB , is it only data/index pages that reside in that
region of memory above the 4GB limit ?
4) Is user connection memory, procedure cache stored in mem to leave area ?
We are using SQL 2000 EE
Thanks
Answers Inline
"Hassan" <Hassan@.hotmail.com> wrote in message
news:%2300axugfGHA.2068@.TK2MSFTNGP02.phx.gbl...
> 1) To use or not to use the 3GB switch. We have 8GB of RAM and just run
> SQL Server with awe and PAE Enabled. I have heard that 3GB is good and I
> have also heard 3 GB is bad with the above setup..
>
3GB is usually OK up to 12 GB total system RAM. Above that it is not
recommended

> 2) We have a 32 bit system and we run into worker thread issues such all
> 256 worker threads are used and all incoming connections are rejected.
> Should I be increasing those worker threads to say 512 ?
How many processors do you have? Are they running above 90% at the time of
the incoming rejections?

> 3) What is stored in the address space above the 4GB limit ..i.e when i
> use awe to address say 6GB , is it only data/index pages that reside in
> that region of memory above the 4GB limit ?
>
AWE memory is not addressable and is used for data cache only.

> 4) Is user connection memory, procedure cache stored in mem to leave area
> ?
No, with one exception. If you are using large packets, then user
connection memory may come out of mem-to-leave if the system has to use OS
network buffers rather than SQL network buffers.

> We are using SQL 2000 EE
> Thanks
>
>
Geoff N. Hiten
Senior Database Administrator
Microsoft SQL Server MVP|||Thanks Geoff..
1) So what are we missing out on for not using 3GB if we have less than 12GB
of RAM ?
2) We have seen worker thread problems on our high-OLTP systems. Some
possible causes could be h/w related, such as a disk IO issue causing slowness
in response times, all revolving around blocking. So say the blocking
sproc runs for 10 secs all of a sudden, but there are around 1000 sprocs
that want to execute and read off that blocked table; then in no time we run
3) With AWE , does data flow from the non addressable memory to addressable
memory ?
"Geoff N. Hiten" <SQLCraftsman@.gmail.com> wrote in message
news:ehrOAIhfGHA.3652@.TK2MSFTNGP02.phx.gbl...
> Answers Inline
> "Hassan" <Hassan@.hotmail.com> wrote in message
> news:%2300axugfGHA.2068@.TK2MSFTNGP02.phx.gbl...
> 3GB is usually OK up to 12 GB total system RAM. Above that it is not
> recommended
>
> How many processers do you have? Are they running above 90% at the time
> of the incoming rejections?
>
> AWE memory is not addressable and is used for data cache only.
>
> No, with one exception. If you are using large packets, then user
> connection memory may come out of mem-to-leave if the system has to use OS
> network buffers rather than SQL network buffers.
>
>
> --
> Geoff N. Hiten
> Senior Database Administrator
> Microsoft SQL Server MVP
>
>|||More Inline
"Hassan" <Hassan@.hotmail.com> wrote in message
news:%23dYzJghfGHA.1456@.TK2MSFTNGP04.phx.gbl...
> Thanks Geoff..
> 1) So what are we missing out on for not using 3GB if we have less than
> 12GB of RAM ?
>
/3GB allows for more SQL addressable memory. Depending on your system, this
may improve performance. As with any config change, test both before and
after to see if it improves things. In most cases with system memory above
4GB and less than or equal to 12GB, you will see an improvement.

> 2) We have seen worker thread problems on our high OLTP systems. Some
> possible causes could be h/w related such as disk IO issue causing
> slowness in response times all revolving around blocking..So say if the
> blocking sproc runs for 10 secs all of a sudden, but there are around 1000
> sprocs that want to execute and read of that blocked table, then in no
> time, we run out of threads and connections are not honored.
>
Monitor your current disk queue length counters for your data and log disks.
If you are having IO stalls, you will need to improve the IO subsystem
before any worker thread count improvements will help. This is why I
hesitate to recommend a specific change over the newsgroups. You need to
find the root cause of your performance issues. Sometimes one issue, such
as an I/O weakness, can show up as memory or CPU pressure to casual
observation.

> 3) With AWE , does data flow from the non addressable memory to
> addressable memory ?
>
That is why you have to limit the use of the /3GB switch. AWE is much like
the old EXTENDED and EXPANDED RAM from the '286 days, where a window of low
(or in this case addressable) memory is used to map into the non-addressable
memory. This is done via the OS memory area, so if you squeeze it too much
on very large memory systems, you introduce some severe performance issues.
The OS needs the extra memory to handle the address windows.
Geoff N. Hiten
Senior Database Administrator
Microsoft SQL Server MVP

> "Geoff N. Hiten" <SQLCraftsman@.gmail.com> wrote in message
> news:ehrOAIhfGHA.3652@.TK2MSFTNGP02.phx.gbl...
>
