IDoc status 51 while loading from R/3 to PSA
Hi experts,
I am loading FI-AR data from R/3 to BW. The request is in yellow status, has been running for a long time, and shows zero (0) records.
I have checked and found that some of the IDocs are not posted.
When I try to push them through BD87, the status still shows 51.
Can you please suggest a solution?
regards
venuscm
Hi,
Maybe this is due to bugs in the tRFC programs that are solved with the
following notes:
1280898 - IDoc hangs when it is to be sent immediately
1338739 - RFC calls in the wrong system
1377863 - Max. no. of gateways exceeded
Also please refer to the following notes for additional info:
1250813 - SAPLARFC uses all dialog work processes
1403974 - Determining the maximum connections (transaction error)
I hope it helps you.
Regards,
Fran.
Similar Messages
-
Data Records are missing in between while loading from R/3 (ECC) to BI.
Dear Experts,
I have created a custom DataSource based on a custom function module. This DataSource contains 600 fields. (I know it's a monster, and the options for splitting it are thin.)
1) Validated the data using RSA3 in R/3; it showed the correct record count.
2) Validated the data by debugging the FM; it still showed the correct record count.
But while loading from R/3 to BI, records are missing.
Various Scenarios load from R/3 to BI:
1a) Loaded a full load (78000 records) with all default data transfer settings. The PSA showed only 72000 records (6000 missing). Compared the IDocs vs. data packets; both reconciled.
1b) Loaded a full load (78000) with modified settings (15000 KB / data packet). The PSA showed only 74000 records (4000 missing).
2a) Loaded with selection parameters (a small chunk, 7000 records) with default data transfer settings. The PSA showed only 5000 records (2000 missing).
2b) Loaded with selection parameters (7000 records) with modified settings (15000 KB / data packet). The PSA showed all 7000 records.
3a) Loaded with selection parameters (a further small chunk, 4000 records). The PSA showed all records regardless of data transfer settings.
Also please look at this piece of code from the function module,
IF l_wa_interface-isource = 'ZBI_ARD_TRANS'.
  l_package_size = l_wa_interface-maxsize DIV 60.
ENDIF.
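This is not the poster's actual function module, but the symptom pattern (full loads lose records, larger data packet sizes lose fewer, small selections lose none) is exactly what an off-by-one in a paged extractor's cursor produces. A toy Python simulation of such a hypothetical bug:

```python
def extract_buggy(rows, maxsize):
    # Mirrors l_package_size = l_wa_interface-maxsize DIV 60
    package_size = maxsize // 60
    out, offset = [], 0
    while offset < len(rows):
        out.extend(rows[offset:offset + package_size])
        # BUG: the cursor advances by package_size + 1, silently
        # skipping one source row per fetched package.
        offset += package_size + 1
    return out

rows = list(range(10_000))
small_packages = extract_buggy(rows, maxsize=6_000)   # package size 100
large_packages = extract_buggy(rows, maxsize=60_000)  # package size 1000
missing_small = len(rows) - len(small_packages)       # 99 rows lost
missing_large = len(rows) - len(large_packages)       # 9 rows lost
```

If the FM keeps its own offset between extractor calls, it is worth checking that the cursor advances by the number of rows actually returned, not by the computed package size.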
I really appreciate your advice or help in this regard.
Thanks much,
Anil
Hi,
Which module do you want?
If it's SD (for example), the steps are:
1> In AWB go to "Business Content"
2> Go to "InfoProvider"
3> Under the InfoArea select the SD cubes
4> Drag the related cubes and ODS to the right panel
5> Set the grouping option "In Data flow before and afterwards"
6> Install the collected objects
Go to R/3:
7> Use Tcode RSA5 and transfer all DataSources for the SD module
Go to BW:
8> Right-click on the source system and choose "Replicate DataSources"
[DataSources|http://help.sap.com/saphelp_nw70/helpdata/en/3c/7b88408bc0bb4de10000000a1550b0/frameset.htm]
-
Error while loading from PSA to cube: RSM2-704 & RSAR-119
Dear Gurus,
I have to extract about 1.13 million records from the setup tables to the MM cube 0IC_C03. While doing so, the following two errors occurred while loading from the PSA to the cube:
Error ID RSM2, error no. 704: Data records in the PSA were marked for data package 39. There were 2 errors here. The system wrote information or warnings for the remaining data records.
Error ID RSAR, no. 119: The update delivered the error code 4.
(Data records for package 39 selected in PSA - 2 error(s). Record 5129: No SID found for value 'EMN ' of characteristic 0BASE_UOM. Record 5132: No SID found for value 'EMN ' of characteristic 0BASE_UOM.)
I tried to change the defective records in the PSA by deleting the erroneous field value EMN and tried to load again, but it failed.
Now, my questions are:
(i) How can I resolve the issue?
(ii) How do I rectify the erroneous record? Should we only delete the erroneous field value, or delete the entire record from the PSA?
(iii) How do I delete a record from the PSA?
Thanks & regards,
Sheeja.
Hi,
Data records for package 39 selected in PSA - 2 error(s). Record 5129 :No SID found for value 'EMN ' of characteristic 0BASE_UOM . Record 5132 :No SID found for value 'EMN ' of characteristic 0BASE_UOM
The issue is with record nos. 5129 and 5132.
In the PSA, filter for erroneous records so that only the error records are displayed, then edit them as required and try reloading into the cube.
Deleting a single record is not possible.
Let us know if you still have any issues.
Reg
Pra -
Data flow tasks faills while loading from database to excel
Hello,
I am getting an error while loading from an OLE DB source to Excel; the error is shown below.
Error: 0xC0202009 at DFT - Company EX, OLE DB Destination [198]: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
Error: 0xC0209029 at DFT - Company EX, OLE DB Destination [198]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "input "OLE DB Destination Input" (211)" failed because error code 0xC020907B occurred, and the error row
disposition on "input "OLE DB Destination Input" (211)" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the
failure.
Error: 0xC0047022 at DFT - Company EX: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "OLE DB Destination" (198) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput
method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
Error: 0xC02020C4 at DFT - Company EX, OLE DB Source 1 [1]: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
Error: 0xC0047021 at DFT - Company EX: SSIS Error Code DTS_E_THREADFAILED. Thread "WorkThread0" has exited with error code 0xC0209029. There may be error messages posted before this with more information on why the thread has exited.
Error: 0xC0047038 at DFT - Company EX: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on component "OLE DB Source 1" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine
called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
Error: 0xC0047021 at DFT - Company EX: SSIS Error Code DTS_E_THREADFAILED. Thread "SourceThread0" has exited with error code 0xC0047038. There may be error messages posted before this with more information on why the thread has exited.
Any help would be appreciated ASAP.
Thanks,
Vinay S
You can use this code to import from SQL Server to Excel:
Sub ADOExcelSQLServer()
' Carl SQL Server Connection
' FOR THIS CODE TO WORK
' In VBE you need to go Tools References and check Microsoft Active X Data Objects 2.x library
Dim Cn As ADODB.Connection
Dim Server_Name As String
Dim Database_Name As String
Dim User_ID As String
Dim Password As String
Dim SQLStr As String
Dim rs As ADODB.Recordset
Set rs = New ADODB.Recordset
Server_Name = "EXCEL-PC\EXCELDEVELOPER" ' Enter your server name here
Database_Name = "AdventureWorksLT2012" ' Enter your database name here
User_ID = "" ' enter your user ID here
Password = "" ' Enter your password here
SQLStr = "SELECT * FROM [SalesLT].[Customer]" ' Enter your SQL here
Set Cn = New ADODB.Connection
Cn.Open "Driver={SQL Server};Server=" & Server_Name & ";Database=" & Database_Name & _
";Uid=" & User_ID & ";Pwd=" & Password & ";"
rs.Open SQLStr, Cn, adOpenStatic
' Dump to spreadsheet
With Worksheets("sheet1").Range("a1:z500") ' Enter your sheet name and range here
.ClearContents
.CopyFromRecordset rs
End With
' Tidy up
rs.Close
Set rs = Nothing
Cn.Close
Set Cn = Nothing
End Sub
Also, check this out . . .
Sub ADOExcelSQLServer()
Dim Cn As ADODB.Connection
Dim Server_Name As String
Dim Database_Name As String
Dim User_ID As String
Dim Password As String
Dim SQLStr As String
Dim rs As ADODB.Recordset
Set rs = New ADODB.Recordset
Server_Name = "LAPTOP\SQL_EXPRESS" ' Enter your server name here
Database_Name = "Northwind" ' Enter your database name here
User_ID = "" ' enter your user ID here
Password = "" ' Enter your password here
SQLStr = "SELECT * FROM Orders" ' Enter your SQL here
Set Cn = New ADODB.Connection
Cn.Open "Driver={SQL Server};Server=" & Server_Name & ";Database=" & Database_Name & _
";Uid=" & User_ID & ";Pwd=" & Password & ";"
rs.Open SQLStr, Cn, adOpenStatic
With Worksheets("Sheet1").Range("A2:Z500")
.ClearContents
.CopyFromRecordset rs
End With
rs.Close
Set rs = Nothing
Cn.Close
Set Cn = Nothing
End Sub
Finally, if you want to incorporate a Where clause . . .
Sub ImportFromSQLServer()
Dim Cn As ADODB.Connection
Dim Server_Name As String
Dim Database_Name As String
Dim User_ID As String
Dim Password As String
Dim SQLStr As String
Dim RS As ADODB.Recordset
Set RS = New ADODB.Recordset
Server_Name = "Excel-PC\SQLEXPRESS"
Database_Name = "Northwind"
'User_ID = "******"
'Password = "****"
SQLStr = "select * from dbo.TBL where EMPID = '2'" ' optionally: and PostingDate = '2006-06-08'
Set Cn = New ADODB.Connection
Cn.Open "Driver={SQL Server};Server=" & Server_Name & ";Database=" & Database_Name & ";"
'& ";Uid=" & User_ID & ";Pwd=" & Password & ";"
RS.Open SQLStr, Cn, adOpenStatic
With Worksheets("Sheet1").Range("A1")
.ClearContents
.CopyFromRecordset RS
End With
RS.Close
Set RS = Nothing
Cn.Close
Set Cn = Nothing
End Sub
Knowledge is the only thing that I can give you, and still retain, and we are both better off for it. -
Hi,
We are facing an issue while loading the data from R/3 to BW using a DTP.
We are getting the following error (for both full and delta loads):
Data Package 1: Errors during processing
Extraction DataSource
Filter out new records with same key
RSDS material
Update to DataStore object
Unpack data package
Exception in substep: write data package
Processing terminated
Set technical status to red
Set overall status to red
Set overall status to green
Further processing started
Set status to 'Processed further'
The expansion of the error Exception in substep: write data package:
Record 0, segment 0001 is not in the cross-record table
Error while writing error stack
Record 0, segment 0001 is not in the cross-record table
Error in substep update to DataStore Object
We are using both start routine and end routine in the transformation.
Any help on this is highly appreciated.
End routine:
IF NOT RESULT_PACKAGE IS INITIAL.
SELECT salesorg comp_code FROM /bi0/psalesorg
INTO TABLE it_salesorg
FOR ALL ENTRIES IN RESULT_PACKAGE
WHERE salesorg = RESULT_PACKAGE-SALESORG
AND objvers = 'A'.
SELECT /bic/ZMATRTUGP FROM /bic/pZMATRTUGP
INTO TABLE it_mat
FOR ALL ENTRIES IN RESULT_PACKAGE
WHERE /bic/ZMATRTUGP = RESULT_PACKAGE-/bic/ZMATRTUGP.
it_data[] = RESULT_PACKAGE[].
ENDIF.
* get company code from salesorg for all materials
LOOP AT it_data INTO st_data.
w_tabix = sy-tabix.
READ TABLE it_salesorg INTO st_salesorg
WITH TABLE KEY salesorg = st_data-salesorg.
IF sy-subrc = 0.
st_data-comp_code = st_salesorg-compcode.
MODIFY it_data FROM st_data INDEX w_tabix.
ENDIF.
ENDLOOP.
REFRESH RESULT_PACKAGE.
RESULT_PACKAGE[] = it_data[].
*Finally check for each material if something has changed since the last
*loading : if yes compare and update, otherwise no upload
IF NOT it_mat[] IS INITIAL.
SELECT * FROM /bic/azalomrtu00
INTO TABLE it_check
FOR ALL ENTRIES IN it_mat
WHERE /bic/ZMATRTUGP = it_mat-zmaterial.
ENDIF.
DELETE it_check WHERE salesorg IS INITIAL.
CLEAR st_result_package.
LOOP AT RESULT_PACKAGE INTO st_RESULT_PACKAGE.
READ TABLE it_check INTO st_check
WITH TABLE KEY
/bic/ZMATRTUGP = st_result_package-/bic/ZMATRTUGP
CUST_GRP4 = st_result_package-CUST_GRP4
SALESORG = st_result_package-SALESORG
VALIDTO = st_result_package-VALIDTO.
IF sy-subrc = 0.
*If one of the characteristic is different, let that entry in the
*result_package otherwise no use updating it
IF st_check-/BIC/ZCSBRTUGP = st_result_package-/BIC/ZCSBRTUGP
AND st_check-/BIC/ZCONDTYPE = st_result_package-/BIC/ZCONDTYPE
AND st_check-comp_code = st_result_package-comp_code
AND st_check-/BIC/ZCNT_RTU = st_result_package-/bic/ZCNT_RTU
AND st_check-VALIDFROM = st_result_package-VALIDFROM.
DELETE RESULT_PACKAGE.
ELSE.
*entry is new : let it updated.
ENDIF.
DELETE it_check WHERE
/bic/ZMATRTUGP = st_result_package-/bic/ZMATRTUGP
AND CUST_GRP4 = st_result_package-CUST_GRP4
AND SALESORG = st_result_package-SALESORG
AND VALIDTO = st_result_package-VALIDTO.
ENDIF.
ENDLOOP.
*if some entries are in the ODS but not in the datapackage, they have to
*be deleted in the ODS since they don't exist anymore
LOOP AT it_check INTO st_check.
CLEAR st_result_package.
st_result_package-/bic/ZMATRTUGP = st_check-/bic/ZMATRTUGP.
st_result_package-CUST_GRP4 = st_check-CUST_GRP4.
st_result_package-SALESORG = st_check-SALESORG.
st_result_package-VALIDTO = st_check-VALIDTO.
st_result_package-recordmode = 'D'.
APPEND st_result_package TO RESULT_PACKAGE.
ENDLOOP.
Thanks in advance.
Thanks,
Meghana
Hi Meghana,
Have you got a short dump in transaction ST22?
If yes, what is it?
In your routine, you should perhaps also keep your own index counter in the loop:
DATA: w_idx LIKE sy-tabix.
When you loop at RESULT_PACKAGE, do: w_idx = sy-tabix.
And then: DELETE result_package INDEX w_idx.
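The reason for the explicit index is the usual delete-while-looping pitfall, which exists in any language. The following Python toy (nothing to do with the actual BW objects) shows how in-place deletion during iteration silently skips records, and the safe rebuild alternative:

```python
def drop_unchanged_unsafe(rows):
    # BUG: removing elements from the list being iterated shifts the
    # remaining items left, so the element after each deletion is skipped.
    for r in rows:
        if r % 2 == 0:  # pretend even numbers are "unchanged" records
            rows.remove(r)
    return rows

def drop_unchanged_safe(rows):
    # Rebuild the table instead of deleting inside the loop.
    return [r for r in rows if r % 2 != 0]

unsafe = drop_unchanged_unsafe([0, 2, 4, 1, 3])  # 2 survives: never examined
safe = drop_unchanged_safe([0, 2, 4, 1, 3])
```

In ABAP, `DELETE RESULT_PACKAGE` inside `LOOP AT RESULT_PACKAGE` is safe for the current row, but any `DELETE ... INDEX` against a stale index shows exactly this skipping behaviour.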
Hope it helps you
Mickael -
Hi Gurus,
I am getting the error "IDocs missing: No end-confirmation arrived in the Warehouse from" while loading data from R/3 to BW via a process chain.
Can any one please suggest me how solve this problem.
Thanks in advance.
Regards
Tony
Hi,
It may be due to the reasons below:
1. Check whether the log table is full on the BW side.
2. Check whether the connection to the source system is fine.
3. The system might be busy, with many jobs running.
Go to SM58 in the BW system and find the tRFC queue with entries for each LUW. You can see why the IDoc was not processed by looking at the error there. Also check whether the job finished successfully in the source system. You can process the LUW manually in SM58 from the Edit menu.
Thanks
DSt -
Dump error ASSIGN_TYPE_CONFLICT while loading from DSO to InfoCube
Hi all,
I had a requirement to modify the transformation rules, and I wrote some routine code for a particular InfoObject
and deleted the data from the DSO and cube. Then I started loading into the DSO. The load to the DSO succeeded, and the modified data for that particular InfoObject is now in the DSO.
But I am facing a dump error while loading data from the DSO to the cube through the DTP.
The error I see in ST22 is "ASSIGN_TYPE_CONFLICT".
Please help.
Thanks in advance,
sravan
Hi,
When I started the load for the first time I got the same error, so I activated the DTP and loaded it again,
then I faced the same problem.
The modified data is already in the DSO and I have validated the data; that is OK.
So do I need to delete this DTP, or create another DTP, to load the data from the DSO to the cube?
I have also checked all the transformation rules; they are all fine, and the DSO structure and InfoCube structure are OK.
Please suggest,
Thanks LAX, -
How to validate data is in specific list while loading from SQL*Loader
I have a sample data file like below
1,name1,05/02/2012 10:00:00,blue
2,name2,06/02/2012 10:00:00,red
3,name3,07/02/2012 10:00:00,yellow
4,name4,08/02/2012 10:00:00,white
I would like to validate that the 4th column is a valid color, i.e. all colors should be in a specific list; if a value is not in the list, the record should go to the bad/discard file.
How can I do that while loading the data with SQL*Loader?
Probably a lot easier with an EXTERNAL TABLE (they're much more flexible).
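If SQL*Loader has to stay in the picture, another workaround is to pre-split the file before the load so that invalid rows never reach the loader. A rough Python sketch (column layout taken from the sample data; the allowed list and file handling are assumptions):

```python
import csv

ALLOWED = {"blue", "red", "yellow", "white"}  # the "specific list"

def split_rows(lines):
    """Route rows whose 4th column is an allowed color to 'good';
    everything else goes to 'bad' (mimicking SQL*Loader's .bad file)."""
    good, bad = [], []
    for row in csv.reader(lines):
        (good if row[3].strip().lower() in ALLOWED else bad).append(row)
    return good, bad

sample = [
    "1,name1,05/02/2012 10:00:00,blue",
    "2,name2,06/02/2012 10:00:00,red",
    "5,name5,09/02/2012 10:00:00,purple",  # not in the list
]
good, bad = split_rows(sample)
```

The `good` rows can then be written to a temporary file and loaded normally, while `bad` rows are logged, which is roughly what an external table with a filtering `WHERE` clause would do inside the database.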
Is SQL Loader a mandatory requirement for some reason? -
Tracking history while loading from flat file
Dear Experts
I have a scenario wherein I have to load data from a flat file at regular intervals and also want to track the changes made to the data.
The data will be like this, and it keeps changing; the key is P1,A1,C1.
DAY 1---P1,A1,C1,100,120,100
DAY 2-- P1,A1,C1,125,123,190
DAY 3-- P1, A1, C1, 134,111,135
DAY 4-- P1,A1,C1,888,234,129
I am planning to load the data into an ODS and then into an InfoCube for reporting. What will be the result, and how do I track history, e.g. if I want to see what the data was on day 1 as well as the current data for P1,A1,C1?
Just to mention: as I am loading from a flat file, I am not mapping RECORDMODE in the ODS from the flat file.
Thanks and regards
Neel
Hi
You don't mention your BI release level, so I will assume you are on the current release, SAP NetWeaver BI 2004s.
Consider loading to a write-optimized DataStore object to store the data. That way, you automatically get a unique technical key for each record, and it will be kept for historical purposes. In addition, load to a standard DataStore object, which will track the changes in its change log. Then load the cube from the change log (this avoids your summarization concern), as the changes will be updates (after images) in the standard DataStore object.
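As a toy illustration of why loading the cube from the change log works (a simplified model, not the real activation logic; a single key figure in overwrite mode is assumed): activating a request writes a sign-reversed before image plus an after image, so summing the change log per key always reproduces the active value.

```python
def activate(active, change_log, request):
    """Simplified standard-DSO activation: write a negated before image
    ('X') and an after image for each record, then overwrite the
    active table."""
    for key, new_value in request.items():
        if key in active:
            change_log.append((key, "X", -active[key]))  # before image
        change_log.append((key, "", new_value))          # after image
        active[key] = new_value

active, log = {}, []
activate(active, log, {("P1", "A1", "C1"): 100})  # day 1
activate(active, log, {("P1", "A1", "C1"): 125})  # day 2
# A cube fed from the change log stays additive-consistent:
delta_total = sum(value for _, _, value in log)
```

Here `delta_total` is 100 - 100 + 125 = 125, the same as the active value, which is why an additive InfoCube can safely consume the change log as its delta.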
Thanks for any points you choose to assign
Best Regards -
Ron Silberstein
SAP -
Short dump while loading from ods1 to ods2
hi,
When I am loading from ODS1 to ODS2 I am getting a short dump. I generated ODS1 as a DataSource and I am loading the data. Can I delete the generated DataSource from the data marts?
bye
GK
Go to the Details tab and then analyse; it will help.
Get Basis into the loop if required.
Let us know if you have any queries.
regards -
Kumar Sarvepalli -
Error while loading 0Plant master data in PSA
Hi,
I am trying to load attribute master data for 0PLANT, but in the PSA I get the following error:
'Record 1: Plant CU: No sales organization found; local currency cannot be checked'.
We are not using the SD module, so no sales organization data is maintained. I have gone through a couple of posts posted here in SDN and have applied the correction instructions manually as stated in OSS note 750322 (as mentioned for case 3), but I still get the same error.
Hi,
The issue is due to an erroneous record.
Delete the request in the master data and try the following.
1) Go to the PSA --> filter on the status (red or green).
Select the red records and correct the data where the error is thrown.
After correcting the data, save, then right-click the PSA and choose "Start update immediately".
2) Load via InfoPackage only to the PSA and later update from the PSA and check.
Any one of the above 2 methods will solve the issue.
Regards,
Balaji V -
Doesn't load from source system to PSA and DSO
Hi Gurus,
We have launched a delta load from the source system (POSDM) to a DSO. In the source system there are 5,000,000 rows, but none were loaded into the DSO or PSA. The delta load finished OK, but no data was loaded.
If the delta load is launched again, will those 5,000,000 rows be loaded, or can they no longer be loaded?
Any idea about this issue? Any feedback will be really appreciated.
Thanks in advance.
Hi David,
Are you sure these 5 million records are delta records and should be pulled into BW as a delta?
Did you count the number of records in the underlying tables?
Which DataSource are you using?
Delta loads are supposed to bring only the new or changed records since the last data load, not all the records.
Since the request is green, as you said, and it still shows 0 records, it is possible that nothing has changed since the last delta. Try to see the details of the data load; if it failed, maybe that's the reason you are not able to see any records.
If you schedule the delta again, it will bring the records changed or created after the last delta.
If the delta was unsuccessful, then turn the QM status and overall status of the request to red manually in the monitor, delete it from the PSA and all the targets into which it was loaded, and schedule the delta again; it should bring the delta.
Thanks
Ajeet
-
Error while loading from Flat file
Hi,
I'm trying to load master data from a flat file. After loading I get the following message, even though I have used the PSA as the staging area. I can see that the data is present in the master data (text) table without any errors, though. Kindly help.
Error when updating Idocs in Business Information Warehouse
Diagnosis
Errors have been reported in Business Information Warehouse during IDoc update:
EDI: Partner profile not available
System response
There are IDocs with incorrect status.
Procedure
Check the IDocs in Business Information Warehouse . You can get here using the BW Monitor.
Removing errors:
How you remove the errors depends on the error message you receive.
Check your flat file format and field separator. Is it a .txt or .csv file?
Go to "Source Systems" in "Modeling", choose your source system for flat files, and perform a check.
Maybe the problem is with your source system.
This might also be due to the logical names of the systems not being maintained. Also, has anyone done a flat file upload before this on the system, and what is the version?
IDoc issue while loading the data into BI
Hello Gurus,
Initially I had a source system connection problem. After it was fixed by Basis, I followed the process below.
I am loading the data using a generic extractor, specifying the selection; it is a full load. When I load the data from the R/3 PP application, I find the following result in the monitor screen of BW:
1. The job completed in the R/3 system and 1 million records were fetched by the extractor.
2. The records were not posted to the BW side because of a tRFC issue.
3. I got the IDoc number and processed it in transaction BD87. But it did not process successfully; it gives the following error: "Error when saving BOM. Please see the application log."
4. When I check the application log using transaction SLG1 with the time and date of that particular process, it is in yellow status.
Kindly let me know how I can resolve this issue. I have already tried repeating the InfoPackage job and still face the same issue. I have also checked the connection; it is OK.
Regards
Hello Veerendra,
Thanks for your quick response. Yes, I am able to process it manually; after processing it ended with status 51, application not posted.
Could you please help me out with the same?
Regards
-
Transformation Rule: Error while loading from PSA to ODS using DTP
Hi Experts,
I am trying to load data from PSA to ODS using DTP. For about 101 records I get the following error:
"Runtime error while executing rule -> see long text RSTRAN 301"
On further looking at the long text:
Diagnosis
An error occurred while executing a transformation rule:
The exact error message is:
Overflow converting from ''
The error was triggered at the following point in the program:
GP4808B5A4QZRB6KTPVU57SZ98Z 3542
System Response
Processing the data record has been terminated.
Procedure
The following additional information is included in the higher-level
node of the monitor:
o Transformation ID
o Data record number of the source record
o Number and name of the rule which produced the error
Procedure for System Administration
When looking at the detail:
Error Location: Object Type TRFN
Error Location: Object Name 06BOK6W69BGQJR41BXXPE8EMPP00G6HF
Error Location: Operation Type DIRECT
Error Location: Operation Name
Error Location: Operation ID 00177 0000
Error Severity 100
Original Record: Segment 0001
Original Record: Number 2
Please can anyone help in tracing this error to the exact spot in the transformation rule?
Thanks & Regards,
Raj
Jerome,
The same issue.
Here are some fields which differ in length when mapped in the transformation rules:
ODS field            | DataSource field
PROD_CATEG CHAR 32   | Category_GUID RAW 16
CRM_QTYEXP INT4      | EXPONENT INT2
CRM_EXCRAT FLTP 16   | EXCHG_RATE DEC 9
CRM_GWEIGH QUAN 17,3 | Gross_Weight QUAN 15
NWEIGH QUAN 17,3     | Net_Weight QUAN 15
CRMLREQDAT DATS 8    | REQ_DLV_DATE DEC 15
The difference is that some DATS fields are mapped to decimal fields, the CHAR 32 field is mapped to RAW 16, or CALWEEK/CALMONTH is mapped to CALDAY.
In almost all cases the ODS field is larger than the input source field.
Thanks
Raj