Problematic G4 Cube
The other day there was a power outage in my home for around 2 hours. My G4 400 Cube was working fine before that; now it won't even power on. I tried unplugging everything and flipping the PMU switch, but it is still dead. It won't turn on, nothing lights up, and even the power button doesn't respond. What could be the problem? Do I need to buy a new power supply? Where can I get one? I haven't been able to find one.
I pulled the plug out of the power supply and it did have a brown stain on it, so I started to assume it was fried. However, I don't know for certain, and I really don't feel like spending the money on a new power supply if that's not the problem.
I'd appreciate any help!
Thanks.
Hi,
An outage or a surge??
Actually, the Cube 'brick' PSU supplies two 28 V rails. The connector to the Cube has 4 pins; the 28 V comes in pairs, two on top and two on bottom, and the surrounding shield is the ground. If you check voltages and 28 V shows on the connector pins, then the next crucial part in the power chain is the VRM. Check the PSU first, and remove and re-seat all power cord connections too. See this for the pinout:
http://cubeowner.com/kbase2/index.php?page=indexv2&id=69&c=35
A blown VRM will also exhibit zero startup power. This item splits one 28 V pair into 3 V, 5 V and 12 V rails for different uses in the Cube; the other 28 V pair goes to the ADC connector for the video card. A blown VRM is not good news.
Check the PSU and tell us what happens. We can take things from there.
A new Cube PSU can be found on Fleabay.
Regards,
Dave
Similar Messages
-
Data Loading Error for cube 0TCT_C22
Dear Colleagues,
I am working on BI 7.0 / SP09.
I am loading the technical content cube 0TCT_C22 from DataSource 0TCT_DS22. Up to the PSA there is no problem with the data load, but the load from PSA to the data target fails. The monitor shows the error "Error calling number range object 0TCTPRCSCHN for dimension D2 ( ). Message no: RSAU023".
I tried to find SAP Notes for this, but with no success. I also checked the dumps and application logs, but nothing is there.
Please advise ASAP.
Regards
PS
Hi Pank,
I just solved a very similar issue. Try what I did and see if it works for you. For each dimension in each InfoCube, a number range is created. For some weird reason, during the activation of the InfoCube the number range for the problematic dimension was not created. Look for it in transaction SNRO: you should find a number range for every dimension in the cube except the one giving you the error.
To solve it (the easiest way I found), just add any characteristic to the problematic dimension and activate the InfoCube. After that, modify the InfoCube again, remove the characteristic you just added to leave the dimension as you need it, and activate the InfoCube again. This forces the regeneration of the dimension and, with it, the number range. You can check in SNRO and the number range should be there. Try loading your data again and it should work.
One thing I don't understand is why that number range sometimes is not created during activation.
Good luck, I hope you can solve it!!!
Regards,
Raimundo Alvarez -
Hi experts,
I have the field 0FISCYEAR in my DSO and also in the cube. But when creating the transformation, the field 0FISCYEAR is present in the DSO but not in the cube, so I am unable to create a transformation rule for 0FISCYEAR. Can anyone please explain what I should do to make the field visible in the cube?
One more question: how can we select or hide the fields of 0FI_AR_3? This DataSource is not visible in LBWE.
Full points will be assigned.
Thanks & Regards,
V N.
Hi VN,
Go to SBIW -> Application-Specific -> Logistics -> Managing Extract Structures -> Initialization -> Application-Specific Setup of Statistical Data, select Inventory Management, and perform the setup.
There, choose Material Movements and/or Setup: Invoice Verification, Revaluation, depending on what you want to load. The direct transactions are OLI1BW for material movements and OLIZBW for invoice verification.
Once you are in the statistical setup, give a name for the run and a termination date and time, then execute.
Once the setup tables start filling, go to RSA3 (the extractor checker) to check your DataSource data.
Before all this, make sure your DataSource is active.
I hope this is clear.
mahesh
Edited by: Mahesh Kumar on May 19, 2008 7:00 AM -
Cannot find master data in the cube
ItemNo is a navigation attribute of 0Material. We are using 0Material in a cube. When I display the cube data including ItemNo, I don't find the data I am looking for. For example:
In 0Material:
0Material ItemNo Description
AAA 001 Test
BBB 002 Test
In the cube I expect to see:
0Material ItemNo On Hand Qty
AAA 001 1000
BBB 002 2000
Instead I saw:
0Material ItemNo On Hand Qty
AAA 1000
BBB 2000
0Material got refreshed after the cube data was loaded. Is that why I don't find the refreshed master data?
Thank you!
Hi,
When you load the cube, the master data values are created automatically if they were not present earlier (depending on the "Data Update Type in the Data Targets" setting in the InfoPackage).
When you loaded your cube with 0Material and ItemNo, the values for ItemNo were apparently blank. That is why you cannot see the ItemNo in the cube even though it is now present in the master data: the cube holds two different key combinations, one with an ItemNo value and one without.
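A minimal sketch of why this happens (in Python, purely for illustration — this is not SAP code, and the values are hypothetical): a cube aggregates key figures by the full combination of characteristic values, so a record loaded with a blank ItemNo and one loaded with ItemNo filled land under two different keys:

```python
from collections import defaultdict

# Hypothetical fact records: (0MATERIAL, ItemNo, quantity).
# The first was loaded before the ItemNo master data existed,
# so its ItemNo is blank.
facts = [
    ("AAA", "",    1000),
    ("AAA", "001",  500),
]

cube = defaultdict(int)
for material, item_no, qty in facts:
    # The cube key is the full set of characteristic values.
    cube[(material, item_no)] += qty

# Blank vs. filled ItemNo yields two separate rows, not one:
print(dict(cube))  # {('AAA', ''): 1000, ('AAA', '001'): 500}
```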
Regards,
Yogesh. -
Indexes on cubes or aggregates on InfoObjects
Hello,
Please tell me if it is possible to put indexes on cubes; are they added automatically, or is this something I put on them myself?
I do not understand indexes - are they like aggregates?
I need to find information that explains this.
Thanks for the help.
Newbie
Indexes are quite different from aggregates.
An aggregate is a slice of a cube that speeds up data retrieval when a query is executed on the cube. Basically it is a kind of snapshot of key figures (KPIs) and business indicators (characteristics) that is read for the initial query result instead of the full cube.
An index, in turn, is a database structure that reduces query response time. When an object is activated, the system automatically creates primary indexes; optionally, you can create additional ones, called secondary indexes. Before loading data, it is advisable to delete the indexes and rebuild them after the load.
Indexes act like pointers for quickly getting at the data. When you delete, it deletes the indexes; when you create, it creates them again.
We delete them before loading because otherwise the database has to update the existing indexes with every record, which hurts load performance; deleting them and re-creating them afterwards takes less time than updating them row by row during the load.
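The drop/load/recreate pattern can be sketched generically (here with SQLite in Python, purely as an illustration — in BW this is done on the cube's fact table via the Manage screen or a process chain step, not by hand):

```python
import sqlite3

# Illustrative "drop indexes, load, recreate" pattern.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact (material TEXT, qty INTEGER)")
conn.execute("CREATE INDEX idx_material ON fact (material)")

# Before a large load: drop the secondary index so each insert
# does not also pay the cost of an index update.
conn.execute("DROP INDEX idx_material")
conn.executemany("INSERT INTO fact VALUES (?, ?)",
                 [("M%03d" % i, i) for i in range(1000)])

# After the load: rebuild the index once, which is cheaper than
# maintaining it row by row during the load.
conn.execute("CREATE INDEX idx_material ON fact (material)")
conn.commit()
```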
There is one more thing to take care of: if you have more than 50 million records this becomes expensive, so instead delete and re-create the indexes during the weekend, when there are no users. -
Hi,
In my InfoCube, the material type for one of the materials is not getting displayed.
When I check the cube contents for this material, all the fields are displayed except material type.
However, it is present in the material master data, from which it is fed through the update rules to populate the cube.
It is displayed for some other materials, so we can't say that the mapping is wrong or that there is a problem with the update rules.
Can somebody let me know what the reason could be?
Thanks,
Jeetu
Hi Jeetu,
Can you check in your cube whether, for one material, you have entries with AND entries without MATL_TYPE? If this is the case, then you were loading transaction data before the corresponding material master data existed.
You should adapt your scenario:
- First, do not use the standard attribute derivation in your update rules: its performance is very bad.
- Implement a start routine that fills an internal table with the material and MATL_TYPE for all materials in your data package.
- Implement an update routine on MATL_TYPE with a READ on this internal table, and raise ABORT = 4 if MATL_TYPE is initial or the material is not found.
Now, to fix your current situation, you'll have to reload your cube, or alternatively reload just the missing MATL_TYPE/material combinations from the cube itself and selectively delete the records where it is empty.
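The lookup pattern described above - read the master data once, then derive the attribute per record and abort when it is missing - can be sketched as follows (Python, not ABAP; the table contents and names are made up for illustration):

```python
# Hypothetical material master lookup table (material -> MATL_TYPE),
# built once, as the start routine would build its internal table.
material_master = {"M100": "FERT", "M200": "ROH"}

def derive_matl_type(records):
    """Fill matl_type from the lookup table; raise if a material is
    unknown, mirroring the ABORT = 4 in the update routine."""
    out = []
    for rec in records:
        matl_type = material_master.get(rec["material"])
        if not matl_type:
            raise ValueError("missing master data for %s" % rec["material"])
        out.append({**rec, "matl_type": matl_type})
    return out

# A hypothetical data package of transaction records:
package = [{"material": "M100", "qty": 10}, {"material": "M200", "qty": 5}]
print(derive_matl_type(package))
```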
hope this helps...
Olivier. -
Amount not getting displayed in the cube
Hi,
I have a ZTable which has got an Amount field declared like this.
AMOUNT WAERS CUKY 5 0 Currency Key
And I declared ZAMOUNT as an amount key figure in the cube, with 0CURRENCY as its currency.
Initially there was no value in the Amount column of the Z-table, and hence I could not see any Amount value in the cube. After entering values in the Z-table, I deleted and re-replicated the DataSource, created the transfer rules and update rules, then created an InfoPackage and the Data Transfer Process. But I am still not able to see the Amount value in the cube.
Kindly Help
Sam...
> Hi Sam,
>
> Did you assign the AMOUNT field from the Z-table to InfoObject
> ZAMOUNT in the transfer rules? That is, move AMOUNT from left to
> right in the tab 'DataSource/Transf. Rules'?
> And is ZAMOUNT mapped (with ZAMOUNT)?
>
> Hope this helps.
Hi Edwin,
The assigning and mapping you mention is just drawing an arrow from AMOUNT on the left to ZAMOUNT on the right, correct? I did that. But there is an additional 0CURRENCY on the right side that is not mapped. Should I map that to AMOUNT as well? Is there any issue with the rules?
Regards,
Sam... -
Hi experts,
I am using the cube 0PUR_C01, which is loaded from 3 DataSources: 2LIS_02_ITM, 2LIS_02_SCL and 2LIS_02_S012.
But there are only a few fields in this standard cube, and I want to add some more.
Can anyone please suggest what the major fields for purchasing data are, so that I can include them in my InfoCube?
Regards,
Bhadri M.
Hi,
We have modified our cube to contain the following key figures and characteristics. These are made available by modifying the standard DataSource in LBWE and pulling in the additional fields available for selection.
1. Characteristics which are of use are as follows:
Calendar Day
Calendar Year/Month
Calendar Year/Week
Fiscal year / period
Fiscal year variant
Calendar Year/Quarter
Base Unit of Measure
Local currency
Country key
Company code
Number of purchasing info record
Purchasing info record category
Material
Valuation type
Product Description
Material group
Purchasing organization
Indicator: Data to Be Canceled
Vendor
Version
Value Type for Reporting
Flag for Contracts
Plant
Storage location
Supplying Plant
PO Number
PO Line Number
Purchasing document type
Purchasing document category
"Delivery Completed" Indicator
Item Category in Purchasing Document
Reason for Ordering
Acct Assignment Category
Control indicator for purchasing document type
Confirmation control key
Tax on sales/purchases code
Shipping conditions
Purchasing group
2. Key Figures of use:
Number of deliveries
Delivery Date Variance 1
Delivery Date Variance 2
Delivery Date Variance 3
Delivery Date Variance 4
Delivery Date Variance 5
Delivery quantity variance 1
Delivery quantity variance 2
Delivery quantity variance 3
Delivery quantity variance 4
Delivery quantity variance 5
Invoice amount: Returns
Weighted total delivery time
Effective order value of returns
Value of goods received in local currency
Goods receipt value as at posting date
Invoice Receipt Quantity as at Posting Date
Invoice Amount as at Posting Date
Actual goods receipt quantity
Goods receipt quantity of returns
Goods receipt qty in base unit (calculate wtd.delivery time)
GR value: Returns as at posting date
Invoiced amount
Invoice receipt quantity of returns
IR quantity: Returns as at posting date
IR value: Returns as at posting date
Invoice receipt quantity
Effective purchase order value
Target delivery quantity
Order quantity (returns)
Number of contract items
Number of scheduling agreement schedule lines
Number of purchase order schedule lines
Number of purchase order items
Order quantity
Number of quotation items
Number of request for quotation items
Number of scheduling agreement items
Total delivery time in days
Net Purchase Order Value
Purchase Main IV Value
Cheers... -
Hi Experts
Here the client wants 200 fields in the cube.
Is that advisable? Please help me with this.
Regards
Anand
Hi,
It's possible to have that many fields in the cube, but you have to consider what the reporting requirements are, how the data is connected, how much data volume will be generated, and at what level you are going to store the data in the cube. If your client wants line-item-level or granular data in the cube, suggest keeping that in an ODS instead and providing jump queries from the cube to the ODS if necessary.
If many of the fields are not going to be required for reporting at present or in the future, suggest to the client to have them in the ODS and a smaller subset of the fields in the cubes.
Cheers,
Kedar -
Issue when uploading Sales data from DSO to Cube.
Dear All,
I have an issue when uploading sales data from a DSO to a cube. I am using BI 7.0 and have loaded all sales-document-level data into my DSO. I then use a transformation rule to calculate the sales value in the DTP to the cube. The cube holds customer-wise aggregated data.
In DSO I have NetPrice(KF) and Delivered_QTY(KF). I do a simple multiplication routine in the transformation from DSO to Cube.
RESULT = SOURCE_FIELDS-NET_PRICE * SOURCE_FIELDS-DLV_QTY .
At the moment I read from the active table (without archive) of the DSO, since this is my first load.
The issue is that the sales value figure in the cube is incorrect: I am getting very large values, which is impossible.
Can someone please help me.
Shanka
Hi,
Are you sure that the cube has customer-wise aggregated data? A cube always aggregates the key figure values for the same set of characteristic values.
Did you check the other key figures as well - are they also inflated, or is the problem with this key figure only?
During the data load, the records may be aggregated first for the same characteristic values, and only then does the multiplication happen. If that is the case, you may have to multiply the values before they are aggregated - i.e. in the data package itself - which can be achieved through a start routine.
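This aggregation-order point can be illustrated with a small sketch (Python, hypothetical numbers): if records for the same characteristic combination are summed before the multiplication, the routine effectively computes sum(price) * sum(qty) instead of sum(price * qty), which inflates the result:

```python
# Three sales documents for the same customer (hypothetical values).
records = [
    {"price": 10.0, "qty": 2},
    {"price": 10.0, "qty": 3},
    {"price": 10.0, "qty": 5},
]

# Multiply per record, then aggregate (what the routine should do):
correct = sum(r["price"] * r["qty"] for r in records)

# Aggregate first, then multiply (what happens if records are summed
# for the same characteristic values before the routine runs):
inflated = sum(r["price"] for r in records) * sum(r["qty"] for r in records)

print(correct, inflated)  # 100.0 300.0
```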
But first verify whether the other key figures have the same issue.
Thanks
Ajeet -
CUBE Not getting All records from DSO
Hi Experts ,
We have a situation where we have to load data from a DSO into a cube. The DSO contains only 9 records, but while loading into the cube, the cube receives only 2 records; thus 7 records are missing. Also, the cube contains more fields than the DSO. In the transformation, we have written an end routine for the extra fields, which are filled by reading master data. Any pointers on how to get the missing records, or on what the error is?
Sam
Why multiple threads????
-
Impact if the FIGL10 DSO is not staged before the FIGL10 cube
Hello SDNers,
What would the impact be if I don't stage the 0FIGL_O10 DSO between the DataSource and the 0FIGL_C10 cube?
The structures would be similar - no additional fields in the cube. Even the SAP-delivered DSO and cube have similar fields.
The DataSource, 0FI_GL_10, is delta-capable.
Some background: after enhancing the FIGL10 DSO with some fields that exist in the DataSource, I need 19 keys to determine unique records, i.e. to pull all records from the PSA without any being overwritten.
However, the OSS note relevant to this DataSource suggested using artificial keys (for the same DSO; here one artificial key would concatenate 4 keys).
Data volumes will be low - about 200 thousand records a month.
Would the SAP-suggested method (artificial keys) be a better bet?
Please share your thoughts; any input would really help.
Jr Roberto,
You can do this. In the BW system:
1. Go to table RSOLTPSOURCE.
2. Enter 0FI_GL_10 as datasource.
3. Check the value of the delta process field, and look that value up in table RODELTAM.
That will tell you what kind of delta data comes from that extractor. Based on the delta type (after image, before image and after image, etc.), you can decide whether you need the DSO or not.
-Saket -
Data not matching in cube which gets from two DSOs
Hi All,
I have a requirement.
The cube gets data from 2 DSOs.
In DSO 1, there are three fields which are mapped to one field in the cube.
DSO 2 is similar: 4 fields in DSO 2 map to 1 field in the cube.
When the data is loaded into the cube, it is shown split: DSO 1 data and DSO 2 data appear as separate rows.
I want them combined.
Can anyone let me know what I can do to show the data combined?
Thanks in advance!
Thanks to both of you.
I am mapping different characteristics from DSO 1 and DSO 2 to the cube.
Of course, 1 or 2 characteristics are common to both DSOs.
Employee LNum deptid frname compcode
12 VBNM W123
13 KNML W145
15 345 K45
16 864 L89
So I see the data split: employee IDs 12 and 13 are from DSO 1, and 15 and 16 are from DSO 2.
Even if I give the same employee numbers, the same kind of split appears.
Please suggest how to combine both DSOs' data in the cube.
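For illustration only (a Python sketch with hypothetical field names, not SAP code): the rows split because a cube merges records only when every characteristic value matches, and the fields filled from one DSO are blank in the rows from the other. Keying only by the characteristics common to both DSOs would combine them:

```python
from collections import defaultdict

# Hypothetical rows landing in the cube from the two DSOs;
# each DSO leaves the other's characteristics blank.
rows = [
    {"employee": "12", "deptid": "",    "compcode": "W123", "amount": 100},  # from DSO 1
    {"employee": "12", "deptid": "345", "compcode": "",     "amount": 50},   # from DSO 2
]

by_all = defaultdict(int)     # key = all characteristics (what the cube does)
by_common = defaultdict(int)  # key = only the shared characteristic
for r in rows:
    by_all[(r["employee"], r["deptid"], r["compcode"])] += r["amount"]
    by_common[r["employee"]] += r["amount"]

print(len(by_all), len(by_common))  # split vs. combined row counts
```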
Thanks In advance! -
Added new field to cube but data not passed from DSO
Hope someone can help.
(BI 7.0) We added new fields to a cube. The fields already existed in the DSO. When we ran the process chain in development for the first time after making this change, we noticed that the 'historical' data for these fields was populated in the cube. When we do the same in our quality system, only new data passed to the cube is updated. In development, in the sub-chain DTP request, we see all previous requests listed under selections; in quality it is only the latest request. The only difference is that the Delta Init flag (extraction mode) in the DTP request is ticked in development but not in quality. Does anyone know why this is?
Hi Peter,
Adding fields to a cube doesn't affect the delta status; the delta DTP should handle delta requests automatically.
I guess that in your quality system the cube had already been updated with all requests from the ODS before you imported the change request, while in the development system none of the requests in the ODS had been loaded before the change.
Regards,
Frank -
Hi Gurus
I am working on BI 7, and per a requirement I had to add 2 fields to a cube: calendar month (0CALMONTH) and a key figure, document number (0CRM_NUMDOC). In the transformation, 0CALMONTH is mapped to a date field in the DSO (0CRM_CRD_AT), and the key figure is mapped to the field of the same name in the DSO. After adding the fields to the cube and performing a full load, I do not see values for these 2 fields in all of the records loaded into the cube. Kindly suggest; points will be handsomely awarded.
Hi Sharma,
As you said, the fields were added to the cube; in this case, does the DataSource already have these fields, and were they simply not mapped at the initial stage of mapping?
Can you check in RSA3 whether the fields you added contain data for this particular DataSource?
If you have enhanced the DataSource, then you have to check at the source-system level itself.
Since you are using BI 7 with the DTP method, the only place you map is in the transformation.
There it is a direct mapping, and if you map 0CALMONTH to a date field it will apply the conversion automatically.
The key figure, if it is a direct mapping, should come through the same way.
Best Regards,
VNK.