Load Optimization
Hi Experts,
Can you please tell me about load optimization techniques in BW? I need to improve the performance of a process chain that used to run quickly for a particular load, but now takes much longer even though only a few extra records were added on top of the previous records.
Please suggest what I can do.
Hi,
Please find below some of the performance optimization techniques which may help you:
1) Increase the parallel processing during extraction.
2) Selective loading.
3) Every job has a priority (A, B, or C, where A is the highest and C the lowest); choose this based on your scenario.
4) Check with BASIS for the sizing of your server.
5) We can increase the number of background processes during data loads by converting dialog processes into background processes.
For this you need BASIS input. (This is done via profile settings that make the system behave differently during loads, something like a day mode/night mode.)
6) There are some maintenance jobs that should run regularly in any SAP box to ensure proper functioning. Check them.
7) Prefer start routines over update routines.
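As a generic illustration of point 1 above (not BW-specific; the packet size, worker count, and extractor stub are all invented for the sketch), an extraction can be split into packets that are processed in parallel:

```python
from concurrent.futures import ThreadPoolExecutor

def extract_packet(start, end):
    # Stand-in for a real extractor call; here we just return the range.
    return list(range(start, end))

def parallel_extract(total_records, packet_size=1000, workers=4):
    # Split the full record range into packets and extract them in
    # parallel, mimicking how a load can run several extraction
    # processes at once instead of one long serial read.
    ranges = [(i, min(i + packet_size, total_records))
              for i in range(0, total_records, packet_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        packets = pool.map(lambda r: extract_packet(*r), ranges)
    result = []
    for p in packets:
        result.extend(p)
    return result
```

Note that `ThreadPoolExecutor.map` preserves packet order, so the combined result comes back in the original sequence.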
Regards,
KK.
Similar Messages
-
Hi,
I have a cube with the dimension information below, and it needs data load optimization. Its data is cleared and reloaded every week from a SQL data source using a load rule. It loads 35 million records, and the data load alone (excluding calculation) takes 10 hours. Is that common? Is there any change to the structure I should make to speed up the load, such as changing Measures to sparse or changing the position of dimensions? The block size is also large, 52920 B, which seems absurd. My cache settings are below as well, so please have a look and give me suggestions.
Dimension   Density   Type       No. of Members
MEASURE     Dense     Accounts   245
PERIOD      Dense     Time       27
CALC        Sparse    None       1
SCENARIO    Sparse    None       7
GEO_NM      Sparse    None       50
PRODUCT     Sparse    None       8416
CAMPAIGN    Sparse    None       35
SEGMENT     Sparse    None       32
Cache settings :
Index Cache setting : 1024
Index Cache Current Value : 1024
Data File Cache Setting : 32768
Data file Cache Current Value : 0
Data Cache Setting : 3072
Data Cache Current Value : 3049
I would appreciate any help on this. Thanks!

10 hrs is not acceptable even for that many rows. For this discussion, I'll assume a BSO cube.
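As a quick sanity check on the numbers above (a sketch: a stored BSO block is the product of the stored member counts of the dense dimensions, at 8 bytes per cell), the posted outline does account for the reported block size:

```python
# Dense dimensions from the posted outline: MEASURE (245) and PERIOD (27).
dense_members = [245, 27]
BYTES_PER_CELL = 8  # Essbase BSO stores each cell as an 8-byte double

block_cells = 1
for n in dense_members:
    block_cells *= n  # cells per block = product of dense member counts

block_size = block_cells * BYTES_PER_CELL
print(block_size)  # 52920, matching the reported 52920 B
```

So the large block size follows directly from the two dense dimensions; shrinking it means changing the dense/sparse configuration, not the cache settings.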
There are a few things to consider
First, what is the order of the columns in your load rule? Can you post the SQL? Is the SQL sorted as it comes in? Optimal for a load is to have your sparse dimensions first, followed by the dense dimensions (preferably with one of the dense dimensions as columns instead of rows), for example your periods going across like Jan, Feb, Mar, etc.
Second, do you have parallel data loading turned on? Look in the config for DLTHREADSPREPARE and DLTHREADSWRITE. By multithreading you can get better throughput.
Third, how does the data get loaded? Is there any summation of the data before it is loaded, or do you have the load rule set to additive? Doing the summation in SQL would speed things up a lot, since each block would only get hit once.
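The benefit of summing before the load can be sketched generically (a Python toy, not Essbase-specific; the field layout and member names are invented): duplicate keys are collapsed up front so each target block is written only once.

```python
from collections import OrderedDict

def preaggregate(rows):
    # rows are (sparse_key, dense_key, value) tuples; summing duplicate
    # keys before loading means the loader touches each block once
    # instead of re-reading and re-writing it per incoming row.
    totals = OrderedDict()
    for sparse_key, dense_key, value in rows:
        k = (sparse_key, dense_key)
        totals[k] = totals.get(k, 0) + value
    # Emit sorted by sparse key first, matching the recommended
    # sparse-then-dense load order.
    return sorted((s, d, v) for (s, d), v in totals.items())

rows = [("ProdA", "Jan", 10), ("ProdA", "Jan", 5), ("ProdB", "Feb", 3)]
print(preaggregate(rows))
# [('ProdA', 'Jan', 15), ('ProdB', 'Feb', 3)]
```

In practice the same effect comes from a GROUP BY plus ORDER BY in the source SQL, which also shrinks the row count sent over the network.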
I have also seen network issues cause this, as transferring this many rows would be slow (as Krishna said), and I have seen the number of joins in the SQL cause massive delays in preparing the data. Out of interest, how long does the actual query take if you just execute it from a SQL tool? -
CPU Work load optimization in SAP HANA
Hi All,
In virtualized SAP HANA, there is an option to assign/optimize vCPUs to correspond to physical CPUs, as mentioned in the document below:
http://www.vmware.com/files/pdf/SAP_HANA_on_vmware_vSphere_best_practices_guide.pdf
But in physical SAP HANA systems, are the features mentioned below available?
Can we assign dedicated CPU cores manually to a particular user/users ?
Or, Is there a way to reserve certain CPU cores for particular Application/Schema Threads/Sessions?
Thanks for your help!!
-Gayathri
Message was edited by: Julie Blaufuss

Nope, there is no such CPU-core-based allocation option available.
You can limit how many worker threads will be created, in case you need to balance CPU usage in a multi-instance setup.
However, you don't have any control on the actual amount of CPU resource that any SAP HANA instance, let alone DB user or query has.
Comparing this with what you can do in vSphere is not suitable, as we are looking at a different level of abstraction here.
To SAP HANA the machine that vSphere emulates will have x cores and SAP HANA will use these x cores - all of them.
It's important not to forget that you have an additional layer of indirection here and things like CPU core binding can easily have negative side effects.
For SAP HANA users it would be more interesting to have workload management within SAP HANA that would allow managing different requirements on responsiveness and resource usage (it's not just CPU...). And that is what SAP HANA development is working on.
Maybe we're lucky and see some features in this direction later this year. -
Where Load Balancing Takes Place
Hi guys:
I've seen a post by Todd Little.
http://www.oracle.com/technetwork/middleware/tuxedo/overview/ld-balc-in-oracle-tux-atmi-apps-1721269.pdf
In section "Where Load Balancing Takes Place"
It said
Whereas for /WS clients, the tpcall/tpacall/tpconnect just send the service request to WSH and *do not*
*perform load balancing in /WS clients*. WSH calls native client routine to achieve the load balancing
task on behalf of /WS clients. *To achieve the load balancing between /WS clients and WSL servers,*
*multiple WSL access points can be configured by WSNADDR.* This feature can assign the /WS clients
evenly to different WSL servers to balance the work load between WSL/WSH in the system.
I'm very confused by this description. In MP (clustered) mode, where does load balancing take place, the client side or the server side? I mean, do all clients send requests to the master node, and the master node dispatches the requests to the other slave nodes?

Hi,
All load balancing (or perhaps better called load optimization) occurs in the native client code, where all request routing occurs. So in an MP configuration, the native client (or a handler in the case of workstation, Jolt, or IIOP clients) makes the routing decision, which includes load balancing. The client looks at the load on all the servers across the cluster and makes a routing decision taking into consideration such things as NETLOAD.

Note however that before Tuxedo 12c, the load information held locally for remote servers (actually queues) was never updated in real time by the remote machines. Thus the load would increase continuously until the next BB scan, at which point the BBL would zero the locally held load information for remote queues. In Tuxedo 12c with TSAM installed, the locally held load information for remote queues IS updated dynamically using a variety of techniques, including piggybacking load information on reply messages and periodically sweeping load information to other machines in the cluster. The former works OK, whereas the latter works really well.
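As a toy model of the client-side decision described above (purely illustrative; the queue names, load figures, and NETLOAD value are invented), the router picks the queue with the lowest effective load, where remote queues pay a NETLOAD penalty:

```python
def pick_queue(queues, netload=50):
    # Each queue is (name, current_load, is_remote). The caller-side
    # router adds NETLOAD to remote queues so that local servers are
    # preferred when loads are comparable, in the spirit of Tuxedo's
    # native-client routing decision.
    def effective(q):
        name, load, is_remote = q
        return load + (netload if is_remote else 0)
    return min(queues, key=effective)[0]

queues = [("local_q", 120, False), ("remote_q", 90, True)]
print(pick_queue(queues))  # local_q: 120 < 90 + 50
```

The real implementation also has to cope with the stale remote-load problem described above; this sketch assumes the load figures are current.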
Also, the MASTER node in a cluster is basically just the boss for configuration and state changes. It has no special role in request routing or processing. In fact, if the MASTER machine dies, the cluster continues to operate just fine, but configuration and state changes such as starting or stopping servers can't occur until the MASTER is back up or migrated to the BACKUP.
Regards,
Todd Little
Oracle Tuxedo Chief Architect -
Concatenating members in the data file before loading, due to a change in the outline
We have 8 Dimensions(11.1.1.3)
The data file has been coming from the source system in this format for 3 years, as below:
D1  D2  D3  D4  D5  D6  Product  Currency
.   .   .   .   .   .   a        USD
.   .   .   .   .   .   a        EUR
.   .   .   .   .   .   b        GBP
.   .   .   .   .   .   b        INR
Now the product names have been changed in the outline to:
a_USD
a_EUR
b_GBP
b_INR
So, is there any way in the Hyperion suite (like in a rules file or elsewhere) where I can concatenate Product and Currency and get the file as:
D1  D2  D3  D4  D5  D6  Product  Currency
.   .   .   .   .   .   a_USD    USD
.   .   .   .   .   .   a_EUR    EUR
.   .   .   .   .   .   b_GBP    GBP
.   .   .   .   .   .   b_INR    INR
Please do let me know
Thanks in Advance.
Edited by: 838300 on Sep 27, 2011 9:00 AM

While what Mehmet wrote is correct, if this is anything more than a quick and dirty fix, may I suggest you go down another ETL path? (Yes, this is Cameron's load rule rant coming right at you, but in abbreviated form.)
ETL in a load rule is great to get the job done, but is a pain to maintain. You would be (unpleasantly) surprised how these have a way of growing. Have you given a thought to fixing it at the source or doing some true ETL in a tool like ODI, or staging it in SQL and doing the mods there? I know, for a simple(ish) change, that seems overkill, but load rules for the purposes of ETL are Evil.
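Whichever layer ends up doing it, the concatenation itself is tiny. A minimal scripting sketch (the sample rows are made up, and a real staging job would read and write files rather than strings):

```python
import csv
import io

def concat_product_currency(text):
    # Read comma-separated rows, join the last two fields (Product,
    # Currency) with "_" to form the new Product member name, and
    # keep Currency as its own column.
    out = []
    for row in csv.reader(io.StringIO(text)):
        fields = [f.strip() for f in row]
        *dims, product, currency = fields
        out.append(dims + [product + "_" + currency, currency])
    return out

sample = ".,.,.,.,.,.,a,USD\n.,.,.,.,.,.,b,INR\n"
print(concat_product_currency(sample))
# [['.', '.', '.', '.', '.', '.', 'a_USD', 'USD'],
#  ['.', '.', '.', '.', '.', '.', 'b_INR', 'INR']]
```

The same one-line transform expressed in SQL or an ODI mapping keeps it out of the load rule, per the rant above.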
Regards,
Cameron Lackpour
P.S. For anyone who must see me go bonkers over this, see: http://www.network54.com/Forum/58296/thread/1311019508/Data+Load+Optimization+-Headervs+Column -
How to Create a Table with Merge and partitions in HANA
Hi,
What is the best way to create a table with MERGE, PARTITION, and UNLOAD PRIORITY options?
Can anybody please give me some examples?
Regards,
Deva

Ok,
1) the UNLOAD PRIORITY has nothing to do with the order of data loads in your ETL process
2) Unloading of columns will happen automatically. Don't specify anything specific for the tables, then SAP HANA will take care about it
3) Not sure where you get your ideas from, but there is no need to manually "flush" tables or anything like that. SAP HANA will take care of memory housekeeping.
4) Partitioning and how to specify it for tables has been largely documented. Just read up on it.
5) Delta Merge will happen automatically, as long as you don't prevent it (e.g. by trying to outsmart the mergedog rules)
Seriously, I get the impression that this list of requirements is based on hearsay and a lack of actual information and experience with SAP HANA. There are a couple of extensive discussions on data loading optimization available here on SCN and on SAPHANA.COM. Please read those first.
All this had been discussed broadly a couple of times.
- Lars -
BIOS update, then will not boot from any drive
This old laptop, 3000 C200 8922 was acting very clunky when on wireless.
Downloaded Bios Version 63ET62WW and flashed it into the bios.
Now, even though I can get to the bios screen, and it shows the hard drive, DVD/CD drive, it will NOT boot from either.
It will also "see" a usb installed, but will not boot from it.
I have removed all power and cleared the CMOS several times and I still have nothing. Tried two different hdd's which have been in the machine and same results.
HELP!! I should have never messed with it......hi k4pew,
One rule after a BIOS update is to load Optimized Defaults.
Try that: go inside the BIOS, hit F9 to load defaults, then F10 to Save and Exit.
If not:
Try turning on the computer and start hitting F1 to load the Recovery Menu, and see if that helps; then try resetting the PC.
If it is still the same, I'm afraid the BIOS needs to be reflashed.
I'm not aware of any flash tool or previous BIOS version firmware available for download.
You can try calling Lenovo Technical Support for any options available for repair.
I'm sorry to hear what happened.
Regards
Solid Cruver
Did someone help you today? Press the star on the left to thank them with a Kudo!
If you find a post helpful and it answers your question, please mark it as an "Accepted Solution"! This will help the rest of the Community with similar issues identify the verified solution and benefit from it.
Follow @LenovoForums on Twitter! -
NON-transactional session bean access entity bean
We are currently profiling our product using the Borland OptimizeIt tool, and we found some interesting issues. Due to our design, we have many session beans which are non-transactional, and these session beans access entity beans to do read operations, such as getWeight and getRate. Since it's read only, there is no need to do the transaction commit, which really takes time, as can be seen in the profile. I know WebLogic supports read-only entity beans, but it seems that this only benefits the ejbLoad call; my test program shows that WebLogic still creates a local transaction even when I specified transaction not supported, and Transaction.commit() will always be called in postInvoke(). From the profile, we got that for a single method call, such as getRate(), 80% of the time is spent in postInvoke(). Any suggestion on this? BTW, most of our entity beans use Exclusive lock; that's the reason we use non-transactional session beans, to avoid deadlock problems.
Thanks

Slava,
Thanks for the link; actually, I read it before, and the following is what I extracted from the doc:
<weblogic-doc>
Do not set db-is-shared to "false" if you set the entity bean's concurrency
strategy to the "Database" option. If you do, WebLogic Server will ignore the
db-is-shared setting.
</weblogic-doc>
Thanks
"Slava Imeshev" <[email protected]> wrote:
Hi Jinsong,
You may want to read this to get more detailed explanation
on db-is-shared (cache-between-transactions for 7.0):
http://e-docs.bea.com/wls/docs61/ejb/EJB_environment.html#1127563
Let me know if you have any questions.
Regards,
Slava Imeshev
"Jinsong HU" <[email protected]> wrote in message
news:[email protected]...
Thanks.
But it's still not clear to me regarding the db-is-shared setting: if I specified entity lock as database lock, I assumed db-is-shared is useless, because for each new transaction the entity bean will reload data anyway. Correct me if I am wrong.
Jinsong
"Slava Imeshev" <[email protected]> wrote:
Jinsong,
See my answers inline.
"Jinsong Hu" <[email protected]> wrote in message
news:[email protected]...
Hi Slava,
Thanks for your reply. Actually, I agree with you: we need to review our db schema and separate the business logic to avoid db locks. I can not say, guys, we need to change this and that, since it's a big application, developed since the EJB 1.0 spec; I think they are afraid to make such a big change.

Total rewrite is the worst thing that can happen to an app. The better approach would be identifying the most critical piece and making a surgery on it.
Following are the questions in my mind:
(1) I think there should be many companies using WebLogic Server to develop large enterprise applications; I am just wondering what the main transaction/lock mechanism used is. Is transactional session / database lock / db-is-shared entity the dominant one? It seems that if you specify database lock, db-is-shared should be true, right?

I can't say for the whole community; in my experience the standard usage pattern is session facades calling entity EJBs with the Required TX attribute, plus plain transacted JDBC calls for bulk reads or inserts. Basically it's not true: one will need db-is-shared only if there are changes to the database done from outside of the app server.

(2) For an RO bean, if I specify read-idle-timeout to 0, it should only load once, at first use, right?

I assume read-timeout-seconds was meant. That's right, but if an application constantly reads new RO data, RO beans will be constantly dropped from the cache and new ones will be loaded. You may want to look at the server console to see if there's a lot of passivation for RO beans.
(3) For the clustering part, has anyone used it in a real enterprise application? My concern: since database lock is the only way to choose, how about the effect of ejbLoad on performance? Since most transactions are short-lived, if high-volume transactions are being processed, I am just scared to death about the ejbLoad overhead.

ejbLoad is a part of the bean's lifecycle; how would you be scared of it? If ejbLoads take too much time, it could be a good idea to profile the SQLs used. The right index optimization can make a huge difference. Also, you may want to consider using CMP beans to let WebLogic take care of load optimization.
(4) If using optimistic lock, all the ejbStores need to do a version check or timestamp check, right? How about this overhead?

As for optimistic concurrency, it performs quite well, as you can use lighter isolation levels.
HTH,
Slava Imeshev
"Jinsong Hu" <[email protected]> wrote in message
news:[email protected]...
We are using Exclusive lock for entity beans because we do not want to load data in each new transaction. If we use Database lock, that means we dedicate data access calls to the database; if a database deadlock happens, it's hard to detect, while using Exclusive lock we could detect this deadlock at the container level.

The problem is, using the Exclusive concurrency mode you serialize access to the data represented by the bean. This approach has a negative effect on the ability of the application to process concurrent requests. As a result, the app may have performance problems under load.
Actually, at the beginning, we did use database lock and a transactional session bean, but the database deadlocks and frequent ejbLoads really killed us, so we decided to move to Exclusive lock, and to avoid deadlock, we changed some session beans to non-transactional.

The fact that you had database deadlocking issues tells that the application logic / database schema may need some review. Normally, to avoid deadlocking, it's good to group database operations mixing updates and inserts into one place, so that the db locking sequence is not spread out in time. Moving to forced serialized data access just hides design/implementation problems.

Making session beans non-transactional makes the container create short-lived transactions for each call to entity bean methods. It's a costly process, and it puts additional load on both the container and the database.
We could use ReadOnly lock for some entity beans, but since WebLogic Server will always create a local transaction for an entity bean, and we found transaction commit is expensive, I am questioning why we need to create a container-level transaction for a read-only bean.

First, read-only beans still need to load data. Also, you may see RO beans constantly loading data if db-is-shared is set to true. Another reason can be that RO semantics are not applicable to the data presented by the RO bean (for instance, you have a reporting engine that constantly produces "RO" data, while the application consuming that data retrieves only new data and never asks for "old" data). RO beans are good when there is relatively stable data accessed repeatedly for read-only access.
You may want to tell us more about your app, we may be of help.
Regards,
Slava Imeshev
I will post the performance data; let's see how costly transaction.commit is.
"Cameron Purdy" <[email protected]> wrote:
[original question quoted in full; snipped]

I am worried that you have made some decisions based on an improper understanding of what WebLogic is doing.
First, you say "non-transactional", but from your description you should have those marked as tx REQUIRED to avoid multiple transactions (since non-transactional just means that each database operation becomes its own little transaction).
Second, you say you are using Exclusive lock, which you should only use if you are absolutely sure that you need it (and note that it does not work in a cluster).
Peace,
Cameron Purdy
Tangosol, Inc.
http://www.tangosol.com/coherence.jsp
Tangosol Coherence: Clustered Replicated Cache for Weblogic
"Jinsong Hu" <[email protected]> wrote in message
news:[email protected]...
-
Hi Experts,
Can anybody explains the functionality and configuration of LEO in SAP TM ?
Thanks,
Shakti

Hello Shakti,
Well, I think the best for you is to check the TM Load Optimization help: http://help.sap.com/saphelp_tm91/helpdata/en/d6/fa2c523d240d35e10000000a441470/content.htm?frameset=/en/bc/74d3e1349941bb951fd8c2896685fd/frameset.htm&current_toc=/en/3d/e2a82dff9a4298ad92536c40dbf256/plain.htm&node_id=58
Regards, Marcelo Lauria -
Hello
I'm looking for a guide or manual on proper use of my notebook's battery (charging and discharging, charge time, optimizing its lifespan, etc.).
If it's in Spanish, even better :-)
Thank you
subject & text edited

I would suggest a battery gauge reset, but I'm not 100% sure it will work out; you can still try it. Considering the age of the machine, I would also suggest a battery replacement.
Lets make 2015 the year of the #DO
Did I or someone help you today? Press the star on the left to thank them with a Kudo! -
No IDE device detected on p965 Platinum
Hi, I have a P965 Platinum, and I got the other parts for my new PC (CPU, RAM and VGA). First, I had to update to the latest BIOS, v1.6, so it could work with the 1333 bus of the CPU (e675). I updated and it started OK, but now I have to install a system, and it detects no DVD or hard disk connected to IDE. I tried a lot of configuration combinations (SATA, RAID, IDE, enhanced, disabled, etc.) but couldn't fix it. I have read of a lot of people with problems like this, or UDMA on DVD, fixed with a BIOS update. I have updated to the latest BIOS, but the problem is still not fixed... any idea? I'm looking at installing Windows via the network, but once I have Windows, I'll need the DVD anyway, so I need to fix this problem.
My suggestion is to start from scratch:
Unplug the main A/C power connector from the PSU, hit the power-on button a few times to drain the capacitors, and Clear CMOS. After restarting, load Optimized Defaults in the BIOS and make sure the JMicron controller is set correctly:
Integrated Peripherals --> Onboard RAID Controller --> IDE
Make sure that the IDE devices connected to the JMicron controller are jumpered correctly as Master and Slave.
If that does not solve your problem, I suggest you use a SATA HDD and connect it to one of the Intel SATA Ports. The JMicron controller is a tricky thing and seems to be picky with drives sometimes (especially if two drives (HDD+opt. drive) are hooked up to the IDE port. -
When to switch from LE-TRA to TM?
Hi,
When is it best to switch from the standard Logistics Execution-Transport module to Transportation Management? Where lies the cut-off point and what are the exact numbers regarding this decision? The information on the internet is not sufficient to find exact data on why to switch to TM.
Thank you

Hi,
SAP TM has long since completely overshadowed LE-TRA, with superior usability, process flexibility, very comprehensive transportation planning, execution and charge management, and decision support capabilities. It also benefits (unlike LE-TRA) from ongoing, strong solution investment.
Here are some details that I reviewed with developers a while back, and since then there are even more advantages. Perhaps some of these capabilities help your decision:
- TM supports sales-order-based planning / sales order scheduling for earlier transportation planning, booking and subcontracting (LE-TRA is delivery-based only).
- TM has a powerful transportation cockpit with interactive, map-based planning; transportation network & route visualization; drag-and-drop manual planning; automatic zero-click optimization planning; and embedded transportation execution visibility.
- TM offers everything from manual to fully automated optimization planning (LE-TRA: manual and semi-automated planning only, no optimization).
- TM supports comprehensive constraint-based planning (driving times, equipment requirements, bookings, ...); improved resource allocation (according to weight, volume, transportation units, ...); and better stage optimization (separate pre-, main- and on-carriage planning) & load optimization. LE-TRA: simple rules-based planning, i.e. same start/finish dates, and route-based planning with no lane-specific planning with distances and durations.
- TM has superior mode-specific planning with mode-specific schedules (truck, ocean, air freight & rail), mode-specific distance planning (GIS info), and freight prebooking.
- TM has strong rules-based carrier determination, selection, tendering and collaboration: fully and semi-automatic rules-based carrier determination & tendering based on freight agreement, cost, priority, lane allocation, or business share; multiple tendering variants (direct, peer-to-peer, broadcasting); real-time request-for-quotation monitoring; and carrier collaboration (TM Collaboration Portal, or web-based UI & mobile application). LE-TRA: no carrier determination logic.
- TM supports dynamic, responsive change management (order, delivery, shipment, execution changes).
- TM has integrated freight execution visibility, tracking and event reporting: map integration for freight order status, embedded SAP Event Management & an integrated mobile solution for carrier transport & event notification, and better information granularity for cargo management (i.e. cargo loading status), resource tracking, etc.
- TM: inbound & outbound execution status management. LE-TRA: outbound-based execution / milestone tracking only.
- TM has enhanced freight agreement management (enhanced validity, version and history management, tariff structures, calculation, ...) as well as strategic freight procurement and selling. LE-TRA: complex table entry, less granularity, poorer version management.
- TM has superior costing & charge management (inter-/intra-company costing & revenue distribution; activity-based costing; condition-based costing (dangerous goods, bulky goods); profitability analysis).
- TM has comprehensive decision support: dashboards with KPI reporting and embedded analytics (LE-TRA: Logistics Information System-based, with limited report availability).
Best wishes,
Natalie -
Hi algnerd,
Thank you for considering Sony Community!
My suggestion is to troubleshoot the laptop and the AC adapter first. You can save time and money by doing this. You may follow the guidelines here:
Check the AC adapter, look for possible hardware issue. The light indicator should be green.
Power cycle - remove the battery and all peripherals -> press and hold the power button for 5 seconds. Or (if the battery is not removable) remove all peripherals -> look for a battery off button on the bottom of the laptop -> press and hold for 5 seconds (you may use safety pin for that). This will drain the electricity of the computer. Charge again the laptop and test.
Reset BIOS. - While the unit is restarting, immediately keep tapping F2 key -> Once in BIOS, go to Exit tab -> select Get Default Values or Load Optimize -> hit Enter -> Save changes and exit. Restart the computer.
Download or update these drivers: Sony® Firmware Extension Parser Device Driver, Sony® Shared Library (by visiting Sony's website based on your location or from where you purchased the laptop).
Battery Care Function is a feature that limits the amount the battery will be charged in order to help extend the useful life of the battery. You may disable it to isolate the issue as well.
For further assistance, we kindly recommend visiting our Sony Global Web site for information on contacting the Sony Support Center in your region at http://www.sony.net/SonyInfo/Support/.
Hope my suggestions help.
Best regards,
Vincent
If my post answers your question, please mark it as "Accept as Solution"
Hi,
I have a Sony VPCF24P1E running windows 8.1 pro 64bits, its AC adapter is a VGP-AC19v18.
The problem is that it stops working from time to time for a couple of days (~15 days).
More about the problem: for example, the first time it happened, I unplugged the AC adapter and went to sleep; the next morning, before going out to buy a new one, I tried it and it worked perfectly.
A few days later, the same thing happened (last Thursday, around 4 AM): I unplugged it and plugged it back in and it worked. The same day, around 11 AM~12 PM, the same problem occurred once again, and today (around 3~4 AM) the same thing happened.
Currently the AC adapter is working and charging the PC.
The question is: what is the problem? Is it solvable, or should I buy a new one?
PS: I'm using Battery Care option in VAIO Control Center: charging the pc only to 80% because my pc is constantly plug to power.
Thank you! -
Unlike on Windows PCs, there appears to be no Wizard or any assistance when trying to install software from CDs (eg. for printers etc) Is there any free software on the net available to download which would do the job for me?
Thanks in advance for any replies.

hey there STL,
it should be simple to install software:
1) load the disc
2) when it appears on your desktop, double-click it & it should open
3) there should be icons w/ titles-eg. a page & read me, compass & register, box & install
4) click on the icon you want- in this case install & it should open
5) follow the instructions that appear, these should walk you through
in most cases use the Easy Install; the software will load, optimize your system, and the 'puter will tell you when it's done. Quit & eject the disc.
good luck, hope this helps -
Timing setting in BIOS for Corsair DDR3 on P35D3?
I'm having problems getting my new P35D3 MS-7356 to boot correctly (see symptoms below).
MSI tech wrote: "...contact manufacture of ram for possible timing setting recommendation in bios."
I have one 1024MB stick of the Corsair DDR3 1066MHz PC8500 Dual Channel kit (TWIN3x2048-1066C7 G Memory Retail) in the #1 slot and another in the #3 slot.
The BIOS is version 1.1.0.L53.
I have no experience overclocking. I've been trying to boot using the "Load Optimize Default."
Anyone know what timing setting I should be using that the tech is talking about?
Description of the problem:
Installed MOBO and
graphics card
one HDD
two MB memory
1 USB keyboard
1 USB mouse
ViewSonic 20” LCD monitor
The HDD is powered by a SATA power cable and is plugged into the board via a SATA cable.
This is what happened with the very first boot attempt:
I plugged in the power supply cord
I turned on the power supply and the red LED light on the board comes on.
I press the power-on button on the front of the case and the HDD LED and power switch LED come on.
The power supply fan, the CPU Fan, and the Sys Fan1 (case exhaust fan), start running for two seconds and then stop.
After two more seconds the fans start again and the monitor displays 4 lines of info about the nVidia card and that's it. No BIOS info.
The LCD screen turns black and reverts to "No Signal".
The fans stop for 3-5 seconds and the process repeats itself indefinitely.
If I press the CMOS button I get:
4 lines of nVidia info appear and then the MSI Logo with the option to Set BIOS followed with:
CMOS setting Wrong
Date and Time setting wrong
Press F1 to run setup
Press F2 to load default values
I press F1 and enter the time and date and Optimize for Default and press Save and Windows Vista Ultimate loads. I’m able to work in Vista for as long as I wish.
Then I try to reboot and I have to go through the above steps again, in other words the BIOS does not retain the date and time.
I booted with just one stick installed and then again with the other stick in the #1 slot and, providing I do the above mentioned CMOS reset (& re enter date and time), the PC boots into Vista.
I have tried F2 "Load default values," and Vista loads but the date and time in Vista are incorrect and then my virus software won't download the latest versions.
MSI recommended swapping out the mem, HDD, PS, CPU, and graphics card, or, taking it to a local tech but none of those are realistic options from here on the Big Isle of Hawaii.
Any suggestions will sure be appreciated.
Thanks,
Kerry

Hi Del,
Re: “...battery.” I did check the Lithium batt and it’s new & fully charged. I did the same for the first board I received before I RMA’d it.
Re: “PSU.” It is an ULTRA X3 600W Energy Efficient Modular ULT40073 Power Supply. I’ve written to Ultra describing the problem, telling them that MSI thinks the problem might be their PSU (no reply yet).
Re: “BIOS for update.” The first board came with v1.0 installed. After getting the described boot problem, I updated the BIOS to v1.1 and got the exact same results. The RMA'd board came with 1.1.0.L53 installed. MSI's driver download page shows that this is their latest version. [As an aside, forum members say that I'm the only one they know of who has this new MOBO. ZipZoomFly.com, the vendor, wrote, "You're the only one who's having this problem with this MOBO."]
Re: “I would also not mix memory, either go dual channel or single 1Gb.” I'm confused. I have a CORSAIR 2048MB (2x1024MB) DDR3 1066MHz PC8500 Dual-Channel Kit: one stick in the #1 slot and one stick in the #3 slot. I was advised to test the memory by taking out the #3 stick and seeing if it would boot with just the one stick in the #1 slot. It did. I then exchanged the sticks, again leaving the #3 slot empty, and again I could boot, but with both tests I had to do the same CMOS reset procedure. This test was supposed to eliminate the RAM as the problem. However, after reporting the results of the test to MSI, they said to contact Corsair and ask for their timing setting recommendation in the BIOS. I'm awaiting a reply to my post on Corsair's forum.
Thanks,
Kerry