XML Data Load into relational structures
Hi,
I am quite inexperienced with XML, and I need to import very large XML data files into existing relational structures.
In our production DB we don't use the Java engine, so PL/SQL and SQL*Loader are the only available ways to import the data.
At the moment we get flat files and use the SQL*Loader utility. But an interface to a new system now sends XML data, and I have to fill the same old relational structures with the new data.
Can anybody give me a hint about the best technique for a high-performance import? Are there any existing tools for the relational mapping?
Regards Ralph
Thank you for your reply.
You are right. We only want to break up the XML to fill our relational structures; we don't need the XML data afterwards. But we have to load the data into temporary structures first, because we have to transform it into our own format. (The system that delivers the XML data is external and uses a different data model.)
Is there no more elegant way using database built-in techniques? The XML data we get can be validated against an XML schema.
So I thought one way could be to load the XML into XML DB and register the schema in the database, store the XML data in the default generated object-relational structures, and then program the data transformation and the data flow from these default structures to our target structures in PL/SQL.
I don't know if this approach performs well enough.
If I use an external tool, I have to code the relational mapping outside the database and insert the data via ODBC into temporary structures that I have to create manually.
So I hoped to find a way to load the data into some relational structure using the advantages of XML and XML Schema, and to code the necessary logic inside the DB.
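Whichever route is chosen, the shredding step is conceptually the same: parse the XML, map elements and attributes to columns, and emit rows for a bulk insert. A minimal Python sketch of that step for illustration only; the element and column names here are invented, not from the real feed:

```python
import xml.etree.ElementTree as ET

# Hypothetical input document; a real feed would follow the external
# system's XML schema instead of these made-up names.
xml_data = """<orders>
  <order id="1"><customer>ACME</customer><amount>10.50</amount></order>
  <order id="2"><customer>Globex</customer><amount>99.00</amount></order>
</orders>"""

def shred(xml_text):
    """Flatten each <order> element into an (id, customer, amount) row."""
    root = ET.fromstring(xml_text)
    return [(int(o.get("id")),
             o.findtext("customer"),
             float(o.findtext("amount")))
            for o in root.findall("order")]

rows = shred(xml_data)
print(rows)  # [(1, 'ACME', 10.5), (2, 'Globex', 99.0)]
```

The resulting row tuples would then feed a bulk insert into the temporary tables, regardless of whether the shredding runs inside or outside the database.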
Do you have any further hints for my problem?
Regards Ralph
Similar Messages
-
Importing XML Data Back into the Form
I have a form that shows several subforms based on the selections the user has made while filling in the form. This is working quite well but when I import the XML data back into the form it doesn't show the subforms that have been used.
Is there an easy way to change this?
Thanks in advance!
Emma
Actually the issue may have to do with the fact that the connections aren't bound, but I haven't seen the data.
I have some fairly complex forms that include both subforms and instances, have an XSD embedded, and export as XML. When I import the data, everything…
Now, that being said...
Are your subforms "hidden" and you opt to display them upon selection of a radio button for example, or do you SetInstances()? If you're using visible=TRUE or FALSE, that may also cause some issues.
Try this code on form:ready:
if (this.rawValue == "1") {        // radio button 1 (assumes the buttons' item values are "1", "2", "3")
    _subform1.setInstances(1);
    _subform2.setInstances(0);
    _subform3.setInstances(0);
} else if (this.rawValue == "2") { // radio button 2
    _subform1.setInstances(0);
    _subform2.setInstances(1);
    _subform3.setInstances(0);
} else if (this.rawValue == "3") { // radio button 3
    _subform1.setInstances(0);
    _subform2.setInstances(0);
    _subform3.setInstances(1);
} else {                           // first time open -- I sometimes had issues with subforms being visible on first entry
    _subform1.setInstances(0);
    _subform2.setInstances(0);
    _subform3.setInstances(0);
}
Then, on the click event, essentially copy most of the code you put in form:ready:
if (this.rawValue == "1") {        // radio button 1 (assumes item values "1", "2", "3")
    _subform1.setInstances(1);
    _subform2.setInstances(0);
    _subform3.setInstances(0);
} else if (this.rawValue == "2") { // radio button 2
    _subform1.setInstances(0);
    _subform2.setInstances(1);
    _subform3.setInstances(0);
} else if (this.rawValue == "3") { // radio button 3
    _subform1.setInstances(0);
    _subform2.setInstances(0);
    _subform3.setInstances(1);
}
Of course this will go on top of your radio button group.
If you are exporting to XML, it will make your life a whole lot easier, by the way, to import an XSD and bind your nodes, especially as your forms and data start to get more complex.
Finally, you may already know this, but unless you have Forms Server, any user who wants to import or export the data will need at least full Acrobat Professional. Even if import/export isn't that important and people only need to save data in the form, they will still need full Acrobat.
I hope that helps a bit. Good luck!
Lisa -
Adding leading zeros before data loaded into DSO
Hi
In the PROD_ID field below, some IDs are missing their leading zeros when data is loaded into BI from SRM. The data type is character, total length 40. If the leading zeros are missing, DSO data activation fails and I have to add them manually in the PSA table. I want to add the leading zeros, if they're missing, before the data is loaded into the DSO. The incoming values are always either 4 or 6 characters long, so e.g. 1502 needs 36 zeros in front of it and 265721 needs 34.
Can we use the CONVERSION_EXIT_ALPHA_INPUT function module? Since this is a character field I'm not sure how to use it in that case. Do I need to convert it to an integer first?
Can someone please give me sample code? We're using the BW 3.5 data flow to load data into the DSO. Please also say where the code needs to go: in the rule type or in the start routine.
Hi,
Can you check at the info object level what kind of conversion routine it uses?
Use transaction RSD1, enter your info object, and display it.
At the data source level you can also see the external/internal format that is maintained.
If your info object uses the ALPHA conversion routine, it will get the leading zeros automatically.
Check how the data arrives from the source in RSA3.
If you're seeing this issue for only some records, then you need to check those records.
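The padding logic itself is simple. Here is a sketch in Python, just to illustrate what ALPHA conversion does for numeric values; the real fix would be ABAP (e.g. calling CONVERSION_EXIT_ALPHA_INPUT) in a transfer rule or start routine:

```python
def alpha_input(value: str, length: int = 40) -> str:
    """Mimic ALPHA-style input conversion for numeric IDs:
    left-pad with zeros to the target length."""
    value = value.strip()
    if value.isdigit():          # only purely numeric values get padded
        return value.zfill(length)
    return value                 # non-numeric values are left as-is

print(alpha_input("1502"))    # 36 leading zeros followed by "1502"
print(alpha_input("265721"))  # 34 leading zeros followed by "265721"
```

So a 4-character input ends up with 36 leading zeros and a 6-character input with 34, always totalling 40 characters.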
Thanks -
How can I add the dimensions and load data into Planning applications?
Now please let me know how I can add the dimensions and load data into a Planning application without doing it manually.
You can use tools like ODI or DIM or HAL to load metadata and data into Planning applications.
The data load can be done at the Essbase end using a rules file. But metadata changes should flow from Planning to Essbase through one of the above-mentioned tools; there are also many other ways to achieve the same.
- Krish -
Data load into SAP ECC from Non SAP system
Hi Experts,
I am very new to BODS and I want to load historical data from a non-SAP source system into SAP R/3 tables like VBAK and VBAP using BODS. Can you please provide steps/documents or guidelines on how to achieve this?
Regards,
Monil
Hi,
In order to load into SAP you have the following options
1. Use IDocs. There are several standard IDocs in ECC for specific objects (MATMAS for materials, DEBMAS for customers, etc.). You can generate and send IDocs as messages to the SAP target using BODS.
2. Use LSMW programs to load into the SAP target. These programs require input files in specific layouts, which you can generate using BODS.
3. Direct input - the direct input method is to write ABAP programs targeting specific tables. This approach is very complex, so it needs a lot of thought.
The OSS Notes supplied in previous messages are all excellent guidance to steer you in the right direction on the choice of load, etc.,
However, the data load into SAP needs to be object specific. So merely targeting the sales tables will not help: the sales document data held in the VBAK and VBAP tables you mentioned is related to articles, and these tables hold sales document data for already created articles. So if you want to target these tables specifically, you may need to prepare an LSMW program for the purpose.
To answer your question on whether it is possible to load objects like materials, customers, vendors etc. using BODS: yes, you can.
Below is a standard list of IDocs that you can use for this purpose to load into SAP ECC system from a non SAP system.
Customer Master - DEBMAS
Article Master - ARTMAS
Material Master - MATMAS
Vendor Master - CREMAS
Purchase Info Records (PIR) - INFREC
The list is endless.........
In order to achieve this, you will need the functional design consultants to provide ETL mapping from the legacy data to the IDoc target schema and fields (better to have the technical table names and fields too). You should then prepare the data, putting it through the standard check-table validations for each object along with any business-specific conversion rules and validations. Having prepared this data, you can either generate flat-file output for loading into SAP with LSMW programs, or generate IDoc messages to the target SAP system.
If you are going to post IDocs directly into the SAP target using BODS, you will need to create a partner profile for BODS to send IDocs and define the IDocs you need as inbound IDocs. There are a few more settings, like RFC connectivity and authorizations, needed for BODS to successfully send IDocs into the SAP target.
Do let me know if you need more info on any specific queries or issues you may encounter.
kind regards
Raghu -
Aggregating data loaded into different hierarchy levels
I have some problems when I try to aggregate a variable called PRUEBA2_IMPORTE, dimensioned by a time dimension (parent-child type).
I read the help in the DML Reference of the OLAP Worksheet, which says the following:
When data is loaded into dimension values that are at different levels of a hierarchy, then you need to be careful in how you set status in the PRECOMPUTE clause in a RELATION statement in your aggregation specification. Suppose that a time dimension has a hierarchy with three levels: months aggregate into quarters, and quarters aggregate into years. Some data is loaded into month dimension values, while other data is loaded into quarter dimension values. For example, Q1 is the parent of January, February, and March. Data for March is loaded into the March dimension value. But the sum of data for January and February is loaded directly into the Q1 dimension value. In fact, the January and February dimension values contain NA values instead of data. Your goal is to add the data in March to the data in Q1. When you attempt to aggregate January, February, and March into Q1, the data in March will simply replace the data in Q1. When this happens, Q1 will only contain the March data instead of the sum of January, February, and March. To aggregate data that is loaded into different levels of a hierarchy, create a valueset for only those dimension values that contain data.
DEFINE all_but_q4 VALUESET time
LIMIT all_but_q4 TO ALL
LIMIT all_but_q4 REMOVE 'Q4'
Within the aggregation specification, use that valueset to specify that the detail-level data should be added to the data that already exists in its parent, Q1, as shown in the following statement.
RELATION time.r PRECOMPUTE (all_but_q4)
How to do it this for more than one dimension?
Below is my case study:
DEFINE T_TIME DIMENSION TEXT
T_TIME
200401
200402
200403
200404
200405
200406
200407
200408
200409
200410
200411
2004
200412
200501
200502
200503
200504
200505
200506
200507
200508
200509
200510
200511
2005
200512
DEFINE T_TIME_PARENTREL RELATION T_TIME <T_TIME T_TIME_HIERLIST>
-----------T_TIME_HIERLIST-------------
T_TIME H_TIME
200401 2004
200402 2004
200403 2004
200404 2004
200405 2004
200406 2004
200407 2004
200408 2004
200409 2004
200410 2004
200411 2004
2004 NA
200412 2004
200501 2005
200502 2005
200503 2005
200504 2005
200505 2005
200506 2005
200507 2005
200508 2005
200509 2005
200510 2005
200511 2005
2005 NA
200512 2005
DEFINE PRUEBA2_IMPORTE FORMULA DECIMAL <T_TIME>
EQ -
aggregate(this_aw!PRUEBA2_IMPORTE_STORED using this_aw!OBJ262568349 -
COUNTVAR this_aw!PRUEBA2_IMPORTE_COUNTVAR)
T_TIME PRUEBA2_IMPORTE
200401 NA
200402 NA
200403 2,00
200404 2,00
200405 NA
200406 NA
200407 NA
200408 NA
200409 NA
200410 NA
200411 NA
2004 4,00 ---> here its right!! but...
200412 NA
200501 5,00
200502 15,00
200503 NA
200504 NA
200505 NA
200506 NA
200507 NA
200508 NA
200509 NA
200510 NA
200511 NA
2005 10,00 ---> here must be 30,00 not 10,00
200512 NA
DEFINE PRUEBA2_IMPORTE_STORED VARIABLE DECIMAL <T_TIME>
T_TIME PRUEBA2_IMPORTE_STORED
200401 NA
200402 NA
200403 NA
200404 NA
200405 NA
200406 NA
200407 NA
200408 NA
200409 NA
200410 NA
200411 NA
2004 NA
200412 NA
200501 5,00
200502 15,00
200503 NA
200504 NA
200505 NA
200506 NA
200507 NA
200508 NA
200509 NA
200510 NA
200511 NA
2005 10,00
200512 NA
DEFINE OBJ262568349 AGGMAP
AGGMAP
RELATION this_aw!T_TIME_PARENTREL(this_aw!T_TIME_AGGRHIER_VSET1) PRECOMPUTE(this_aw!T_TIME_AGGRDIM_VSET1) OPERATOR SUM -
args DIVIDEBYZERO YES DECIMALOVERFLOW YES NASKIP YES
AGGINDEX NO
CACHE NONE
END
DEFINE T_TIME_AGGRHIER_VSET1 VALUESET T_TIME_HIERLIST
T_TIME_AGGRHIER_VSET1 = (H_TIME)
DEFINE T_TIME_AGGRDIM_VSET1 VALUESET T_TIME
T_TIME_AGGRDIM_VSET1 = (2005)
Regards,
Mel.
Mel,
There are several different types of "data loaded into different hierarchy levels", and the approach to solving the issue differs depending on the needs of the application.
1. Data is loaded symmetrically at uniform mixed levels. An example would be loading data at "quarter" in historical years but at "month" in the current year; it does /not/ include data loaded at both quarter and month within the same calendar period.
= solved by the setting of status, or in 10.2 or later with the load_status clause of the aggmap.
2. Data is loaded at both a detail level and its ancestor, as in your example case.
= the aggregate command overwrites aggregate values based on the values of the children; this is the only repeatable thing it can do. The recommended way to solve this problem is to create 'self' nodes in the hierarchy representing the data loaded at the aggregate level, each added as one of the children of its aggregate node. This enables repeatable calculation as well as auditability of the resulting value.
Also note the difference in behavior between the aggregate command and the aggregate function. In your example the aggregate function looks at '2005', finds a value, and returns it for a result of 10; the aggregate command would recalculate based on January and February for a result of 20.
To solve your usage case I would suggest a hierarchy that looks more like this:
DEFINE T_TIME_PARENTREL RELATION T_TIME <T_TIME T_TIME_HIERLIST>
-----------T_TIME_HIERLIST-------------
T_TIME H_TIME
200401 2004
200402 2004
200403 2004
200404 2004
200405 2004
200406 2004
200407 2004
200408 2004
200409 2004
200410 2004
200411 2004
200412 2004
2004_SELF 2004
2004 NA
200501 2005
200502 2005
200503 2005
200504 2005
200505 2005
200506 2005
200507 2005
200508 2005
200509 2005
200510 2005
200511 2005
200512 2005
2005_SELF 2005
2005 NA
Resulting in the following cube:
T_TIME PRUEBA2_IMPORTE
200401 NA
200402 NA
200403 2,00
200404 2,00
200405 NA
200406 NA
200407 NA
200408 NA
200409 NA
200410 NA
200411 NA
200412 NA
2004_SELF NA
2004 4,00
200501 5,00
200502 15,00
200503 NA
200504 NA
200505 NA
200506 NA
200507 NA
200508 NA
200509 NA
200510 NA
200511 NA
200512 NA
2005_SELF 10,00
2005 30,00
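The effect of the self nodes can be illustrated outside OLAP DML. A minimal Python sketch of a SUM aggregation over the parent relation above, with the values from the example: data loaded directly at an aggregate level is moved to a *_SELF child, so a plain sum over children reproduces it alongside the detail-level data.

```python
# Parent relation (child -> parent), reduced to the members that matter.
parent = {
    "200403": "2004", "200404": "2004", "2004_SELF": "2004",
    "200501": "2005", "200502": "2005", "2005_SELF": "2005",
}
# Values as loaded: 2005's directly loaded 10,00 now lives in 2005_SELF.
loaded = {"200403": 2.0, "200404": 2.0,
          "200501": 5.0, "200502": 15.0, "2005_SELF": 10.0}

def aggregate(values, parentrel):
    """SUM each parent from its children, keeping the loaded details."""
    totals = dict(values)
    for child, par in parentrel.items():
        if child in values:
            totals[par] = totals.get(par, 0.0) + values[child]
    return totals

totals = aggregate(loaded, parent)
print(totals["2004"])  # 4.0
print(totals["2005"])  # 30.0  (5 + 15 + the 10 held in 2005_SELF)
```

Without the 2005_SELF node, the recalculated 2005 value would be 20,00 and the directly loaded 10,00 would be lost, which is exactly the overwrite behavior described above.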
3. Data is loaded at a level based upon another dimension; for example product being loaded at 'UPC' in EMEA, but at 'BRAND' in APAC.
= this can currently only be solved by issuing multiple aggregate commands to aggregate the different regions with different input status, which unfortunately means that it is not compatible with compressed composites. We will likely add better support for this case in future releases.
4. Data is loaded at both an aggregate level and a detail level, but the calculation is more complicated than a simple SUM operator.
= often requires the use of ALLOCATE in order to push the data to the leaves in order to correctly calculate the aggregate values during aggregation. -
How to delete the data loaded into MySQL target table using Scripts
Hi Experts
I created a job with a validation transform. The data that passes validation is loaded into a Pass table, and the data that fails is loaded into a Failed table.
My requirement is: if any data was loaded into the Failed table, then I have to delete the data loaded into the Pass table using a script.
But in the script I have written the code as
sql('database','delete from <tablename>');
and the query execution raises an exception.
How can I delete the data loaded into the MySQL target table using scripts?
Please guide me for this error
Thanks in Advance
PrasannaKumar
Hi Dirk Venken,
I found the solution; my mistake was that the query was not valid MySQL syntax.
Working query:
sql('MySQL', 'truncate world.customer_salesfact_details')
Error query:
sql('MySQL', 'delete table world.customer_salesfact_details')
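The underlying syntax point can be demonstrated outside BODS. A minimal sketch using Python's sqlite3 (so it runs standalone; MySQL rejects "DELETE TABLE" the same way, and additionally supports TRUNCATE):

```python
import sqlite3

# DELETE requires FROM; there is no "DELETE TABLE" statement in SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer_salesfact_details (id INTEGER)")
conn.execute("INSERT INTO customer_salesfact_details VALUES (1)")

conn.execute("DELETE FROM customer_salesfact_details")       # valid
remaining = conn.execute(
    "SELECT COUNT(*) FROM customer_salesfact_details").fetchone()[0]
print("rows left:", remaining)  # rows left: 0

try:
    conn.execute("DELETE TABLE customer_salesfact_details")  # invalid syntax
except sqlite3.OperationalError as e:
    print("rejected:", e)
```

So either 'delete from <table>' or 'truncate <table>' works in the sql() call; 'delete table <table>' never will.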
Thanks for your concern
PrasannaKumar -
How to make data loaded into cube NOT ready for reporting
Hi Gurus: Is there a way by which data loaded into a cube can be made NOT available for reporting?
Please suggest. <removed>
Thanks
See, by default a request that has been loaded to a cube is available for reporting. But if you have an aggregate, the system needs this new request to be rolled up to the aggregate as well before it is available for reporting. Why? Because we write queries against the cube, not against the aggregate, so you only know whether a query will hit a particular aggregate at its runtime. This means that whether a query gets its data from the aggregate or from the cube, it should ultimately get the same data in both cases. Now if a request is added to the cube but not to the aggregate, these two objects will contain different data. The system takes the safer route of not making the 'unrolled-up' data visible at all, rather than serving inconsistent data.
Hope this helps... -
AWM Newbie Question: How to filter data loaded into cubes/dimensions?
Hi,
I am trying to filter the amount of data loaded into my dimensions in AWM (e.g., I only want to load like 1-2 years worth of data for development purposes). I can't seem to find a place in AWM where you can specify a WHERE clause...is there something else I must do to filter data?
Thanks
Hi there,
Which release of Oracle OLAP are you using? 10g? 11g?
You can use database views to filter your dimension and cube data and then map these in AWM
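A minimal sketch of the view approach, using Python's sqlite3 so it runs standalone; the table and column names are invented for illustration. The idea is to put the WHERE clause in a database view and map the view, not the base table, in AWM:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_fact (time_id TEXT, amount REAL)")
conn.executemany("INSERT INTO sales_fact VALUES (?, ?)",
                 [("2004", 10.0), ("2005", 20.0), ("2006", 30.0)])

# Expose only the last two years for development loads.
conn.execute("""CREATE VIEW sales_fact_dev AS
                SELECT * FROM sales_fact WHERE time_id >= '2005'""")
years = sorted(r[0] for r in
               conn.execute("SELECT time_id FROM sales_fact_dev"))
print(years)  # ['2005', '2006']
```

In AWM you would then map the dimension and cube sources to the filtered view, and the load only ever sees the restricted rows.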
Thanks,
Stuart Bunby
OLAP Blog: http://oracleOLAP.blogspot.com
OLAP Wiki: http://wiki.oracle.com/page/Oracle+OLAP+Option
OLAP on OTN: http://www.oracle.com/technology/products/bi/olap/index.html
DW on OTN : http://www.oracle.com/technology/products/bi/db/11g/index.html -
Xml data upload into tables using loader
Hi,
I have to load XML file data into multiple tables using SQL*Loader. I wrote a control file, but I was not able to load the data into all the tables at once: the first table in the control file loads fine, but the rest of the load gives errors. Please refer to the control file and log below.
Have a great day!
Control file:
LOAD DATA
TRUNCATE
INTO TABLE Derivative_Security
WHEN DERIVATIVESECURITYID != ' '
FIELDS TERMINATED BY '</DerivativeSecurity>' optionally enclosed by '"'
TRAILING NULLCOLS
DUMMY1 filler char(50000) Terminated by '<DerivativeSecurity ',
DUMMY2 filler char(200) Terminated by WHITESPACE enclosed by 'FAS157Level="' and '"',
DUMMY3 filler char(200) Terminated by WHITESPACE enclosed by 'FAS157MVAdjustable="' and '"',
DUMMY4 filler char(200) Terminated by WHITESPACE enclosed by 'OriginalMV="' and '"',
DUMMY5 filler char(200) Terminated by WHITESPACE enclosed by 'FAS157MVDelta="' and '"',
DUMMY6 filler char(200) Terminated by WHITESPACE enclosed by 'FAS157AdjustedMV="' and '"',
DUMMY7 filler char(200) Terminated by WHITESPACE enclosed by 'PLCurrency="' and '"',
DERIVATIVESECURITYID Terminated by WHITESPACE enclosed by 'DerivativeSecurityID="' and '"',
METDERIVATIVEID Terminated by WHITESPACE enclosed by 'MetDerivativeID="' and '"',
MUREXTRANSNUMBER Terminated by WHITESPACE enclosed by 'MurexTransactionNumber="' and '"',
DUMMY8 filler char(200) Terminated by WHITESPACE enclosed by 'Trader="' and '"',
DUMMY9 filler char(200) Terminated by WHITESPACE enclosed by 'BuySell="' and '"',
DETAILTYPE Terminated by WHITESPACE enclosed by 'DetailType="' and '"',
DERIVATIVETYPE Terminated by WHITESPACE enclosed by 'DerivativeType="' and '"',
AL_MANAGEMENTSIDE Terminated by WHITESPACE enclosed by 'AL_ManagementSide="' and '"',
COUNTERPARTYCODE Terminated by WHITESPACE enclosed by 'CounterpartyCode="' and '"',
COUNTERPARTYNAME Terminated by WHITESPACE enclosed by 'CounterpartyName="' and '"',
CURRENCYCODE Terminated by WHITESPACE enclosed by 'CurrencyCode="' and '"',
DUMMY10 filler char(200) Terminated by WHITESPACE enclosed by 'Coupon="' and '"',
DUMMY11 filler char(200) Terminated by WHITESPACE enclosed by 'FixedFloatingIndicator="' and '"',
DUMMY12 filler char(200) Terminated by WHITESPACE enclosed by 'IndexMultiplier="' and '"',
DUMMY13 filler char(200) Terminated by WHITESPACE enclosed by 'IndexName="' and '"',
DUMMY42 filler char(200) Terminated by WHITESPACE enclosed by 'Margin="' and '"',
DUMMY14 filler char(200) Terminated by WHITESPACE enclosed by 'Comment11="' and '"',
DUMMY15 filler char(200) Terminated by WHITESPACE enclosed by 'Comment12="' and '"',
DUMMY16 filler char(200) Terminated by WHITESPACE enclosed by 'Comment13="' and '"',
DUMMY17 filler char(200) Terminated by WHITESPACE enclosed by 'Comment21="' and '"',
DUMMY18 filler char(200) Terminated by WHITESPACE enclosed by 'Comment22="' and '"',
DUMMY19 filler char(200) Terminated by WHITESPACE enclosed by 'Comment23="' and '"',
DUMMY20 filler char(2000) Terminated by WHITESPACE enclosed by 'TradeComment="' and '"',
DUMMY21 filler char(200) Terminated by WHITESPACE enclosed by 'OptionCallPut="' and '"',
DUMMY22 filler char(200) Terminated by WHITESPACE enclosed by 'OptionType="' and '"',
DUMMY23 filler char(200) Terminated by WHITESPACE enclosed by 'SettleType="' and '"',
DUMMY24 filler char(200) Terminated by WHITESPACE enclosed by 'RefISIN="' and '"',
DUMMY25 filler char(200) Terminated by WHITESPACE enclosed by 'RefObligation="' and '"',
DUMMY26 filler char(200) Terminated by WHITESPACE enclosed by 'Sensitivity="' and '"',
DUMMY27 filler char(200) Terminated by WHITESPACE enclosed by 'EffectiveConvexity="' and '"',
DUMMY28 filler char(200) Terminated by WHITESPACE enclosed by 'Vega="' and '"',
DUMMY29 filler char(200) Terminated by WHITESPACE enclosed by 'NextResetDate="' and '"',
DUMMY30 filler char(200) Terminated by WHITESPACE enclosed by 'LastResetDate="' and '"',
EFFECTIVEDURATION Terminated by WHITESPACE enclosed by 'EffectiveDuration="' and '"',
DUMMY31 filler char(200) Terminated by WHITESPACE enclosed by 'Instrument="' and '"',
DUMMY32 filler char(200) Terminated by WHITESPACE enclosed by 'IssuerCode="' and '"',
DUMMY33 filler char(200) Terminated by WHITESPACE enclosed by 'IssuerName="' and '"',
DUMMY34 filler char(200) Terminated by WHITESPACE enclosed by 'IssuerREDCode="' and '"',
DUMMY35 filler char(200) Terminated by WHITESPACE enclosed by 'Strategy="' and '"',
DUMMY36 filler char(200) Terminated by WHITESPACE enclosed by 'StrikePrice="' and '"',
MATURITYDATE Terminated by WHITESPACE enclosed by 'MaturityDate="' and '"',
DUMMY37 filler char(200) Terminated by WHITESPACE enclosed by 'TickerSymbol="' and '"',
DUMMY38 filler char(200) Terminated by WHITESPACE enclosed by 'MetPay="' and '"',
DUMMY39 filler char(200) Terminated by WHITESPACE enclosed by 'MetRec="' and '"',
DUMMY40 filler char(200) Terminated by WHITESPACE enclosed by 'Payrec="' and '"',
DUMMY41 filler char(200) Terminated by WHITESPACE enclosed by 'RiskSection="' and '"',
DUMMY54 filler char(200) Terminated by WHITESPACE enclosed by 'HedgedItem="' and '"',
DUMMY43 filler char(200) Terminated by WHITESPACE enclosed by 'ResetFrequency="' and '"',
DUMMY44 filler char(200) Terminated by WHITESPACE enclosed by 'ResetFrequencyNumber="' and '"',
DUMMY45 filler char(200) Terminated by WHITESPACE enclosed by 'PaymentFrequency="' and '"',
DUMMY46 filler char(200) Terminated by WHITESPACE enclosed by 'PaymentFrequencyNumber="' and '"',
DUMMY47 filler char(200) Terminated by WHITESPACE enclosed by 'CapFloorCoupon="' and '"',
DUMMY48 filler char(200) Terminated by WHITESPACE enclosed by 'RefIndexRate="' and '">',
DUMMY50 filler char(1000) enclosed by "<Classification=" and "/>",
DUMMY51 filler char(1000) enclosed by "<Classification=" and "/>",
DUMMY52 filler char(1000) enclosed by "<Classification=" and "/>",
DUMMY53 filler char(1000) enclosed by "<Classification=" and "/>"
INTO TABLE DERIVATIVE_POSITION
WHEN PORTFOLIOCODE != ' '
FIELDS TERMINATED BY '</NewDataSet>' optionally enclosed by '"'
TRAILING NULLCOLS
DUMMY1 filler char(65000) terminated by '<DerivativePosition ',
DERIVATIVESECURITYID TERMINATED BY WHITESPACE enclosed by 'DerivativeSecurityID="' and '"',
PORTFOLIOCODE TERMINATED BY WHITESPACE enclosed by 'PortfolioCode="' and '"',
LONGSHORTINDICATOR TERMINATED BY WHITESPACE ENCLOSED BY 'LongShortIndicator="' and '"',
FAS157Level filler char(100) TERMINATED BY WHITESPACE enclosed by 'FAS157Level="' and '"',
FAS157MVAdjustable filler char(100) TERMINATED BY WHITESPACE enclosed by 'FAS157MVAdjustable="' and '"',
OriginalMV filler char(100) TERMINATED BY WHITESPACE enclosed by 'OriginalMV="' and '"',
FAS157MVDelta filler char(100) TERMINATED BY WHITESPACE enclosed by 'FAS157MVDelta="' and '"',
FAS157AdjustedMV filler char(100) TERMINATED BY WHITESPACE enclosed by 'FAS157AdjustedMV="' and '"',
CURRENTNOTIONALLOCAL TERMINATED BY WHITESPACE enclosed by 'CurrentNotionalLocal="' and '"',
CURRENTNOTIONALUSD TERMINATED BY WHITESPACE enclosed by 'CurrentNotionalUSD="' and '"',
OPENQUANTITY TERMINATED BY WHITESPACE enclosed by 'OpenQuantity="' and '"',
ORIGINALNOTIONALLOCAL TERMINATED BY WHITESPACE enclosed by 'OriginalNotionalLocal="' and '"',
ORIGINALNOTIONALUSD TERMINATED BY WHITESPACE enclosed by 'OriginalNotionalUSD="' and '"',
ORIGINALQUANTITY TERMINATED BY WHITESPACE enclosed by 'OriginalQuantity="' and '"',
ACCRUEDINTERESTLOCAL TERMINATED BY WHITESPACE enclosed by 'AccruedInterestLocal="' and '"',
ACCRUEDINTERESTUSD TERMINATED BY WHITESPACE enclosed by 'AccruedInterestUSD="' and '"',
ACCRUEDINTERESTBASE TERMINATED BY WHITESPACE enclosed by 'AccruedInterestBase="' and '"',
CLEANMARKETVALUEENDOFDAYLOCAL TERMINATED BY WHITESPACE enclosed by 'CleanMarketValueEndOfDayLocal="' and '"',
CLEANMARKETVALUEENDOFDAYUSD TERMINATED BY WHITESPACE enclosed by 'CleanMarketValueEndOfDayUSD="' and '"',
CLEANMARKETVALUEENDOFDAYBASE TERMINATED BY WHITESPACE enclosed by 'CleanMarketValueEndOfDayBase="' and '"',
DIRTYMARKETVALUEENDOFDAYLOCAL TERMINATED BY WHITESPACE enclosed by 'DirtyMarketValueEndOfDayLocal="' and '"',
DIRTYMARKETVALUEENDOFDAYUSD TERMINATED BY WHITESPACE enclosed by 'DirtyMarketValueEndOfDayUSD="' and '"',
PREMIUMLOCAL TERMINATED BY WHITESPACE enclosed by 'PremiumLocal="' and '"',
PREMIUMUSD TERMINATED BY WHITESPACE enclosed by 'PremiumUSD="' and '"',
PREMIUMBASE TERMINATED BY WHITESPACE enclosed by 'PremiumBase="' and '"',
BIDDIES TERMINATED BY WHITESPACE enclosed by 'Biddies="' and '"',
ADDONEXPOSUREUSD TERMINATED BY WHITESPACE enclosed by 'AddOnExposureUSD="' and '"',
RegulatoryExposureUSD filler char(100) TERMINATED BY WHITESPACE enclosed by 'RegulatoryExposureUSD="' and '"',
PARCR01 filler char(100) TERMINATED BY WHITESPACE enclosed by 'PARCR01="' and '"',
FAS133DESIGNATIONGAAP TERMINATED BY WHITESPACE enclosed by 'FAS133DesignationGAAP="' and '"',
FAS133DESIGNATIONSTAT TERMINATED BY WHITESPACE enclosed by 'FAS133DesignationSTAT="' and '"',
TRADEDATE TERMINATED BY WHITESPACE enclosed by 'TradeDate="' and '"',
EffectiveDate filler char(100) TERMINATED BY WHITESPACE enclosed by 'EffectiveDate="' and '"',
ALLOCATION TERMINATED BY WHITESPACE enclosed by 'Allocation="' and '"' "round(:ALLOCATION,4)",
DUMMY36 filler char(100) enclosed by '/' and '>'
Log:
Table DERIVATIVE_SECURITY:
4079 Rows successfully loaded.
0 Rows not loaded due to data errors.
28074 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Table DERIVATIVE_POSITION:
0 Rows successfully loaded.
0 Rows not loaded due to data errors.
32153 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 248196 bytes(26 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 32153
Total logical records rejected: 0
Total logical records discarded: 28074
When there are multiple tables in a control file, SQL*Loader assumes the data for the first field of the second table immediately follows the last field of the first table. You probably want SQL*Loader to start looking for the first column of the second table at the start of the record. You can do this by using the POSITION clause:
DUMMY51 filler char(1000) enclosed by "<Classification=" and "/>",
DUMMY52 filler char(1000) enclosed by "<Classification=" and "/>",
DUMMY53 filler char(1000) enclosed by "<Classification=" and "/>"
INTO TABLE DERIVATIVE_POSITION
WHEN PORTFOLIOCODE != ' '
FIELDS TERMINATED BY '</NewDataSet>' optionally enclosed by '"'
TRAILING NULLCOLS
DUMMY1 filler POSITION(1) char(65000) terminated by '<DerivativePosition ',
DERIVATIVESECURITYID TERMINATED BY WHITESPACE enclosed by 'DerivativeSecurityID="' and '"',
PORTFOLIOCODE TERMINATED BY WHITESPACE enclosed by 'PortfolioCode="' and '"',
-
XML data inserting into Master, not Page
First, I fully admit that I jumped into the deep end of the pool. This is my first InDesign project and I'm trying to set up a 2-page spread master to support loading my content from XML.
Thanks to the great online help, these forums/community resources, and my growing bookshelf, I've successfully (or so it seems) created my Master layout, with frames appropriately tagged for my XML (as confirmed by the Structure). Since I'm very novice at InDesign, my testing isn't always unambiguous, but I HAVE successfully loaded my external XML (including graphics and anchored text, plus the story) into my document.
The problem is that the data seems to be flowing into the Master, and not the Page. So if I create all the pages I need ahead of time (right now just 2 spreads for my test), both pages are populated with the first record.
If I only create the first spread, then the data flows and I get the magic "+" for overflow, but I can't click on it in the Page. If I change to the Master I can click on it, which seems the major clue as to why this isn't working. I just have no idea how to fix it.
From everything I've read it seems that I should be able to tag my frames in the Master (which I want to reuse for other documents) and have the data flow into the Pages.
Am I missing something obvious?
Or is what I'm trying to do just not possible? (if so, does that mean I have to keep re-tagging the frames every time I make a new Document? ugh)
I've spent hours trying to resolve this, and would really appreciate a pointer to get me moving again.
Thanks!
julie
Hm, that seems contradictory to everything I've read and the examples I've seen (although I may have misunderstood the examples -- none are quite as complicated as my layout). My text frame has several anchors for XML data that doesn't just flow with the story. How do I set up the tagged anchor frames I need if the story text frame is in the Master but not tagged?
I thought that the story text frame and the anchor frames (some graphics and some text) all needed to be in the same context. If I put them in the Master I get the XML flowing into the Master. If I put them into the Page then the new pages created don't have the proper frames. If I just put the story text frame in the Master, how do I associate it with the anchor frames (I only know how to do this by using the Story editor on the tagged frame as described in A Designer's Guide To Adobe InDesign and XML)?
I'll go back and work through the samples again, but I definitely don't understand how it all fits together.
Thanks!
j -
Flash xml data loading and unloading specs
hi, i am trying to get specification information that i cannot
find anywhere else.
i am working on a large flash project
and i would like to load xml data into the same swf
object/movieclip repeatedly.
as i do not want the previously loaded items to unload, i need
to know whether doing this will unload the items from the swf or
keep them in the library so they can be redisplayed without reloading.
i cannot find any supporting documentation either way that
tells me whether loading new content into a clip (i am aware
levels overwrite) will or will not unload the existing content.
thanks in advance.
mk

this is awful for me -- i can't even get the clip to duplicate
-- and i thought this would be the simplest solution to keeping
everything cached for one page before and one page after the current one in
the project.
i have used a simpler clip to test the code and see if i am
insane.
duplicateMovieClip(_root.circle, "prv", 5); // copy _root.circle into a new instance "prv" at depth 5
prv._x = 300; // position the duplicate
prv._y = 300;
prv._visible = true; // make sure it is shown
prv.startDrag(); // attach it to the mouse
this ALWAYS works when i use _root.circle, a simple
green circle.
BUT
when i change it to my main movie clip (which is loaded AND
on screen) -- it just doesn't duplicate at all! i've even
triggered it to go play frame 2 JUST IN CASE.
i've even set visibility to true JUST IN CASE.
ie all i do is change _root.circle to _root.cur
and .... nada.
AND _root.cur IS DEFINITELY on the screen and all xml
components have been loaded into it. (it is a slide with a dynamic
picture and dynamic type and it 100% works)
has anyone had this insanity happen before?
is this an error where flash cannot attach or duplicate
a clip that has dynamic contents??? -
Hello Gurus,
Are there specific steps for sending XML data to BW?
If yes, can I have the steps?
Thanks

Hi Luis,
Follow these steps:
1) Install the XML 3.0 parser
2) Create an InfoSource
3) Assign a PC file as the data source
4) Create the transfer and communication structures and activate them
5) From the transfer structure screen select Extras -- Create BW DataSource with SOAP (Simple Object Access Protocol)
6) After successful generation, the data source is connected to the Myself data mart; the name of the generated data source is 6A*
7) Create an InfoPackage
8) Load the data from the XML file
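The DataSource generated in step 5 receives its XML records wrapped in a SOAP envelope posted over HTTP. A minimal sketch of building such an envelope, in Python for illustration only (the `DATA`/`CUSTOMER`/`AMOUNT` field names are made up, not the actual transfer-structure fields):

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_soap_envelope(payload_xml: str) -> str:
    """Wrap an XML payload in a minimal SOAP 1.1 envelope."""
    ET.register_namespace("soapenv", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    body.append(ET.fromstring(payload_xml))  # payload goes inside the Body element
    return ET.tostring(envelope, encoding="unicode")

# Example record destined for the transfer structure (field names are hypothetical)
record = "<DATA><CUSTOMER>1000</CUSTOMER><AMOUNT>42.50</AMOUNT></DATA>"
print(build_soap_envelope(record))
```

The actual endpoint URL and message schema come from the generated 6A* DataSource; this only shows the envelope shape.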
TQ
Kumar -
How is data loaded into the BCS cubes?
We are on SEM-BW 4.0, package level 13. I'm totally new to BCS from the BW viewpoint. I'm not the SEM person, but I support the BW side.
Can anyone explain to me, or point me to documentation that explains, how the data gets loaded into cube 0BCS_C11 Consolidation (Company/Cons Profit Center)? I installed the delivered content and I can see various export data sources that were generated. However, I do not see the traditional update rules, InfoSources etc.
The SEM person has test loaded some data to this cube and I can see the request under 'manage' and even display the content. However the status light remains yellow and data is not available for reporting unless I manually set the status to green.
Also, I see on the manage tab under Info-Package this note: Request loaded using the APO interface without monitor log.
Any and all assistance is greatly appreciated.
Thanks
Denny

Hi Dennis,
For reporting the virtual cube 0BCS_VC11 which is fed by 0BCS_C11 is used.
You don't need to be concerned about the yellow status. The request is closed automatically after reaching 50,000 records.
About the data stream: you're right, the BW cube is used.
And if your BW has some cubes with information for BCS on a monthly basis, you may arrange a load from a data stream.
I make this BW cube as similar in structure to 0BCS_C11 as possible, for a smooth data load. The cube might be fed by another cube which contains information in another format. In the update rules of the first cube you may transform the data for compatibility of the cube structures.
Best regards,
Eugene -
Hi all,
I am trying to use UTL_FILE.PUT_LINE to output an XML file. This XML file is the result of a select query, so I convert the query result into XML format using dbms_xmlgen.getxml and store it in a variable. But the resulting XML data is larger than 32K. What would be the best way to store it in a variable? When using the CLOB datatype, I get a numeric value error.
Thanks in advance.

As you didn't post the code, nor include a four-digit database version, you are asking someone to look into a crystal ball.
Sorry, they are out for repair.
Also the XDB forum would be more appropriate.
Sybrand Bakker
Senior Oracle DBA
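A common cause of that numeric value error is moving the whole CLOB through a VARCHAR2 buffer, which is capped at 32767 bytes in PL/SQL; the usual workaround is to read the CLOB with DBMS_LOB.SUBSTR in chunks of at most 32767 bytes and pass each chunk to UTL_FILE.PUT. The same chunking idea, sketched in Python purely for illustration (the file name and the fake XML are made up):

```python
# Sketch of the chunked-write pattern: never push the whole document
# through a fixed-size buffer in one call.
CHUNK = 32767  # mirrors the 32K VARCHAR2 limit in PL/SQL

def write_in_chunks(data: str, out_path: str, chunk_size: int = CHUNK) -> int:
    """Write `data` to `out_path` in chunks; returns the number of chunks written."""
    chunks = 0
    with open(out_path, "w", encoding="utf-8") as f:
        for offset in range(0, len(data), chunk_size):
            f.write(data[offset:offset + chunk_size])  # analogous to UTL_FILE.PUT
            chunks += 1
    return chunks

# A fake "large XML document" standing in for the dbms_xmlgen.getxml CLOB
large_xml = "<ROWSET>" + "<ROW><ID>1</ID></ROW>" * 5000 + "</ROWSET>"
n = write_in_chunks(large_xml, "out.xml")
```

In the PL/SQL version the loop bound would come from DBMS_LOB.GETLENGTH and each slice from DBMS_LOB.SUBSTR; the structure of the loop is the same.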