
Tuesday, March 20, 2012

Create a constant calculated member

Hi everyone, my question is about creating a calculated member that must not change when the user changes the selected filter in the dimensions.
I will try to explain it: I need the total number of rows of my cube for a concrete year (year is a dimension). I will use this value in a calculated formula to define the percentage of rows that contain a concrete value (selected in the filter dimension). That means my calculated member divides a factor that changes with the filter (i.e. the number of rows with value 'XXX') by a factor that does not change (the total number of rows for the year).

I have tried many different ways, but I did not succeed. I cannot believe this is not possible...|||Maybe some sample code or DDL will help...

I keep thinking cement for some reason...|||Hi Brett, I understand the confusion since my question is quite strange; nevertheless, I will try to explain.

The main reason is that I need to measure the quality of the data (used to join with the dimensions) in the fact table. The way to do that is to get the rate of rows with correct information. One way to do it would be simply dividing the number of rows where the dimension column equals null by the total number of rows.

I can calculate the total number of rows by creating a calculated member like "count([mydimension].AllMembers)", but this number of rows is true only for the dimension "mydimension". Another possibility is using a column with value 1 for every row and doing a sum([mydimension].AllMembers, mycolumn1), but this is true only when I filter on mydimension.

So, the real question is: how can I get the real total number of rows, valid for any dimension?

I hope this helps...
thx in advance
Oscar
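For what it's worth, here is a minimal MDX sketch of one common way to get a filter-invariant denominator: pin the filtered hierarchy to its All member inside a tuple, so that part of the calculation stays constant while the numerator still follows the user's selection. Every object name below ([MyCube], [Measures].[Row Count], [Dim Value], [Time]) is hypothetical.

-- [Row Count] is assumed to be a plain COUNT measure on the fact table.
-- Pinning [Dim Value] to its All member makes [Total Rows] ignore whatever
-- the user selects on that dimension; repeat the override for every
-- dimension whose filter should be ignored.
WITH
  MEMBER [Measures].[Total Rows] AS
    ( [Measures].[Row Count], [Dim Value].[All] )
  MEMBER [Measures].[Pct Of Rows] AS
    [Measures].[Row Count] / [Measures].[Total Rows],
    FORMAT_STRING = 'Percent'
SELECT { [Measures].[Row Count], [Measures].[Pct Of Rows] } ON COLUMNS,
       [Time].[Year].Members ON ROWS
FROM [MyCube]
WHERE ( [Dim Value].[XXX] )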

Thursday, March 8, 2012

CPU Usage Management and SQL

Right now SQL Server accounts for 80% of my CPU Usage
which is being maxed out and causing my sites to run
slow. I fear this may be related to me not properly
closing connections (I am lazy) and me just hosting too
many databases on one server. Regardless, I need insight
on how to manage this and reset CPU Usage from a SQL
perspective. Also, is there a way to see all connections
opened and individually close them? Will that help, do
you think? Thanks in advance.|||If you execute
sp_who
or
sp_who2
from Query Analyzer, you will be shown a listing of the open connections.
Generally, open connections would not cause CPU usage, but they do use memory. If there are lots of open (unused) connections then your memory usage might be higher than it needs to be. ADO should be using connection pooling, so I would not expect that you have lots of extra connections, but I could be wrong.
The best tool to see what is going on within the database server is Profiler (within the SQL Server program group). It will show you what SQL commands are being sent to the database.
If you can identify which stored procedures/select statements cause problems (take a long time to run, use gobs of CPU) you might be able to re-write them or add indexes so that the system does not have to work as hard to retrieve the information that the application is asking for.
-- Keith, SQL Server MVP
"ASP Dev" <dmicheli@.cmiti.com> wrote in message =news:030201c34b03$bead2830$a301280a@.phx.gbl...
> Right now SQL Server accounts for 80% of my CPU Usage > which is being maxed out and causing my sites to run > slow. I fear this may be related to me not properly > closing connections (I am lazy) and me just hosting too > many databases on one server. Regardless, I need insight > on how to manage this and reset CPU Usage from a SQL > perspective. Also, is there a way to see all connections > opened and individually close them? Will that help me > you think? Thanks in advance.|||Thank You. I appreciate your help.
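To illustrate the advice above: a minimal T-SQL sketch of listing the open connections and closing one individually. KILL is the standard command for closing a session; the SPID value 52 below is just a made-up example read off the sp_who2 output.

-- Show all current connections, who owns them, and what they are running
EXEC sp_who2;

-- Close one specific connection, using the SPID column from the output above
-- (52 is a hypothetical example; you cannot kill your own session)
KILL 52;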
|||We had a history of CPU usage problems here. At peak times the usage would go up to
100% and stay there for 10 or 15 minutes. In the last few weeks it stayed above 95%
for a couple of hours. The problem was poor SQL statements and the management
reports. All queries were plain SELECT or UPDATE; no stored procedures were used.
During business hours the big bosses just wanted to see how well the business was
doing so far that day, so they just clicked whenever they wanted to produce an
up-to-date online report!!! I warned the bosses and developers about
this, but they preferred to upgrade hardware rather than rewrite the
code, and they kept running reports whenever they wanted. After a meeting with
the big bosses they agreed it was a big problem and promised not to run
reports during peak hours. And just the next day we saw them running reports like
crazy again! Then last month the web site got timed out, and they took it
seriously. Spending 50-100K on both software and hardware wasn't an easy thing, so
the developers rewrote the main interface for the site using pure stored
procedures. They also locked out the report tool, keeping it from running between
10am and 2pm. Result: it rocked. Using Insight Manager I see the avg. CPU
usage is between 10 and 15%. Everyone is happy with it. :-)
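For illustration, the kind of rewrite described above might look like this, with every object name made up: an ad-hoc report query that was being sent as raw SQL from the web pages becomes a parameterized stored procedure, so the server compiles one plan and reuses it on every click.

-- Hypothetical example of moving an ad-hoc report query into a stored
-- procedure. The table and column names are invented for illustration.
CREATE PROCEDURE dbo.usp_DailySalesReport
    @ReportDate datetime
AS
BEGIN
    SET NOCOUNT ON;
    SELECT CustomerID, SUM(Amount) AS TotalAmount
    FROM dbo.Sales
    WHERE SaleDate >= @ReportDate
      AND SaleDate < DATEADD(day, 1, @ReportDate)
    GROUP BY CustomerID;
END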
"ASP Dev" <dmicheli@.cmiti.com> wrote in message
news:030201c34b03$bead2830$a301280a@.phx.gbl...
> Right now SQL Server accounts for 80% of my CPU Usage
> which is being maxed out and causing my sites to run
> slow. I fear this may be related to me not properly
> closing connections (I am lazy) and me just hosting too
> many databases on one server. Regardless, I need insight
> on how to manage this and reset CPU Usage from a SQL
> perspective. Also, is there a way to see all connections
> opened and individually close them? Will that help, do
> you think? Thanks in advance.|||On Tue, 15 Jul 2003 14:19:06 -0700, "Flicker"
<hthan@.superioraccess.com> wrote:
>They also locked out the report tool, keeping it from running between
>10am and 2pm. Result: it rocked.
How about you replicate the data out to a mart/warehouse so they can
click all they want?
J.|||There is no need for the report to run every few minutes. The chairman does
it; the VPs do it; the CEO does it. This is more of a habit than a
business need, just like checking stocks every hour. After we
locked them out, they do other things to kill time. :) Hey, they are also
the OWNERS. So long ...
"JXStern" <JXSternChangeX2R@.gte.net> wrote in message
news:6fuahv4s12omn7l7k3c1jfp3ivs2up3h8h@.4ax.com...
> On Tue, 15 Jul 2003 14:19:06 -0700, "Flicker"
> <hthan@.superioraccess.com> wrote:
> >They also locked out the report tool, keeping it from running between
> >10am and 2pm. Result: it rocked.
> How about you replicate the data out to a mart/warehouse so they can
> click all they want?
> J.
>
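A rough T-SQL sketch of JXStern's mart/warehouse suggestion above, with all object names made up: refresh a snapshot table in a separate reporting database on a schedule, so the report clicks never touch the production tables.

-- Hypothetical reporting snapshot, refreshed by a scheduled off-hours job.
-- ReportMart, DailySalesSnapshot, and Production.dbo.Sales are invented names.
USE ReportMart;
TRUNCATE TABLE dbo.DailySalesSnapshot;
INSERT INTO dbo.DailySalesSnapshot (CustomerID, TotalAmount)
SELECT CustomerID, SUM(Amount)
FROM Production.dbo.Sales
GROUP BY CustomerID;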

Wednesday, March 7, 2012

CPU related

I have 4+ CPUs. Do I have to enable the "Use Windows NT fibers" option?
from
Doller|||Required to, no. May benefit from, yes. On a dedicated SQL Server you
should enable priority boost and fibers after testing. If this server is not
dedicated to SQL Server then you will run the risk of throttling the other
services on the server.
burt_king@.yahoo.com
"doller" wrote:
> I have 4+ CPUs. Do I have to enable the "Use Windows NT fibers" option?
> from
> Doller
>|||burt_king wrote:
> Required to, no. May benefit from, yes. On a dedicated SQL server
> you should enable priority boost and fibers after testing. If this
> server is not dedicated to SQL Server then you will run the risk of
> throttling the other services on the server.
>
>> I have 4+ CPUs. Do I have to enable the "Use Windows NT fibers" option?
>> from
>> Doller
The general recommendation with "boost priority" is not to use it, ever.
Even on dedicated servers, I have not read any reports that it actually
helps performance, and I have seen many items in the past that talk about
problems. My understanding is that "NT fibers" might show a small
performance boost if your server is experiencing a lot of context
switching.
My general recommendation is to leave those two options alone and look
for other places to boost performance like your SQL, disk subsystems,
and physical database layout including your backup location.
David Gugick
Quest Software
www.imceda.com
www.quest.com|||Dear David,
We faced a lot of performance problems last year, so we enabled
the fiber switch and it helped us. But a problem arose: we
cannot create jobs and cannot even execute them.
I searched a lot about this bug and found a topic on the Microsoft web
site. They say to disable the fiber option. After disabling the fiber
option and restarting the services, our jobs work fine,
but then we can't use the Windows NT fibers option.
Are Windows NT fibers related to jobs?
from
Doller
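For reference, both settings discussed in this thread are sp_configure advanced options, and both require a restart of the SQL Server service to take effect. A minimal T-SQL sketch of checking them and setting the generally recommended values:

-- Expose the advanced options first
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- 'lightweight pooling' is the "Use Windows NT fibers" checkbox;
-- 'priority boost' is the boost setting discussed above
EXEC sp_configure 'priority boost';
EXEC sp_configure 'lightweight pooling';

-- Set both to 0 (off), then restart the SQL Server service
EXEC sp_configure 'priority boost', 0;
EXEC sp_configure 'lightweight pooling', 0;
RECONFIGURE;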
