OutOfMemoryError while reading flat files
Hi,
I am redesigning a Java program. This program reads two huge flat files, compares the data records, and writes results into three different files. It was written in JDK 1.1 and uses String everywhere to compare and process the data. Now, since I am rewriting it, I am trying to use Hashtable/HashMap so that the data comparison will be much easier and faster.
Each of the two input files can have more than 400,000 records. I am getting an OutOfMemoryError while I read and load my files into a HashMap.
Is it because a HashMap/Hashtable can't be that big, or is it that my JVM is dying?
What else can I use to read this many key-value pairs?
Thanks a bunch
Hi,
I've been facing the same problem although I have 1024 MB of RAM...
It happens when I try to read a 64 MB file.
From what I've heard, it might be a memory leak or... whatever... it is a JRE problem...
You're telling the JRE to use the default maximum heap size (64 MB) by not specifying a value suitable for your application with the -Xmx option.
It is a JRE problem only insofar as the error doesn't point you to the JRE memory options documentation.
Pete
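Beyond raising -Xmx, you can halve the memory pressure by keeping only one file in the map and streaming the other. A minimal sketch, assuming hypothetical pipe-delimited "key|value" records (adjust the parsing to your actual record layout):

```java
import java.io.*;
import java.util.*;

public class RecordCompare {
    // Load ONE file into a map keyed by the record key; pre-size the map
    // for ~400,000 entries to avoid repeated rehashing.
    static Map<String, String> loadKeyed(BufferedReader in) throws IOException {
        Map<String, String> map = new HashMap<>(600_000);
        String line;
        while ((line = in.readLine()) != null) {
            int sep = line.indexOf('|');
            if (sep < 0) continue;               // skip malformed lines
            map.put(line.substring(0, sep), line.substring(sep + 1));
        }
        return map;
    }

    // Stream the second file line by line and compare against the map,
    // writing to the three result files; only one file ever sits on the heap.
    static void compare(BufferedReader second, Map<String, String> first,
                        Writer matched, Writer changed, Writer missing) throws IOException {
        String line;
        while ((line = second.readLine()) != null) {
            int sep = line.indexOf('|');
            if (sep < 0) continue;
            String key = line.substring(0, sep);
            String val = line.substring(sep + 1);
            String old = first.get(key);
            if (old == null)          missing.write(line + "\n");
            else if (old.equals(val)) matched.write(line + "\n");
            else                      changed.write(key + "|" + old + "|" + val + "\n");
        }
    }
}
```

Even with this, give the JVM an explicit heap ceiling sized for the data, e.g. `java -Xmx512m RecordCompare`, rather than relying on the 64 MB default.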
Similar Messages
-
Risk factors while loading flat files
HI,
Can anybody tell me what the risk factors are while loading
flat files...
Thanks in advance,
regards,
Rags
Risks:
1) If flat-file is created manually
- users may forget columns, specifically UNIT and
CURRENCY.
- Key figures may be presented in thousands one time
and in single dollars another time.
- The flat file may be delivered too late.
- Automatic scheduling is difficult if the flat file
comes from a client station. You can't even do it
with a process chain.
- Naming conventions (capitals, leading zeroes,
trailing spaces, special characters)
- Master data conversion
2) If flat-file is created automatically (from legacy
system):
- Data may arrive too late
- Data file (if loaded from server) requires a
connection to the external disk - security risk.
- (Master) Data conversion
Note: In the past I had trouble with uploading flat files from a network drive. It sometimes happened that the flat file was only partly imported and that no errors were reported. When loading data from my local C: drive I never had this problem.
My advice is to upload a flat file into an ODS before using the data further in BW. In the ODS you can add a date to mark the load date. If the load to an ODS is OK, there is a much bigger chance that you will not get problems further down the data stream. Make sure that the key user validates the data in the ODS before using it any further. It pays off to create one or two special queries on this ODS to enable testing of the data. -
I created and deployed a BPEL process with a JMS adapter and DB adapter.
I have tested this and it works fine.
After a few days, I try to open the project and the BPEL process, and when I click on the DB
adapter it throws an exception in the designer:
"Error while reading wsdl file ... (wsdl file name) NULL Exception"
This is hard to debug as the exception does not provide any info, just NULL.
This has happened several times for my project. The next time the JMS adapter hit the
same exception while opening on designer. This one I notice happened after I imported
a new schema file (xsd) into the BPEL process.
The only workaround I currently have is to delete and recreate the adapters, which is time-consuming
as I have to re-create and re-assign a lot of the activities.
Has anyone encountered this and is there a way to fix this without deleting/recreating the adapters ?
Thanks.
I experienced the same issue and found the cause and a workaround:
"Error while reading wsdl file …. Exception: null"
http://www.petervannes.nl/files/b7c08911ce3cde3677e2182bbc5f032a-47.php
Edited by: 944333 on Jul 3, 2012 10:24 PM -
Getting error while loading Flat File
Hello All,
I am getting an error while loading a flat file. The flat file is a CSV file.
Value ',,,,,,,,' of characteristic 0DATE is not a number with 000008 spaces
Data separator |
Escape sign ;
It has 23,708 entries; it loads successfully up to 23,665 entries.
Besides, when checked in the PSA,
records beyond entry 23,667 have the calendar day as ',,,,,,' whereas the rest of the entries have a date.
Also, when I checked the flat file, the total number of rows is 23,667, so I wonder why it shows 23,708 in
RSMO.
Could you please let me know how to correct this.
regards
path
Hi,
For the date column you should maintain the YYYYMMDD format, e.g. 20090601: keep the cursor on the date column, right-click, choose Format > Custom, and make it 00000000, then save the file as .CSV. First type the values in the column, apply this format, save the file, and load it without opening it again. Once you open it you lose the 00000000 format and need to apply the same format again.
Settings in Infopackage:
Data Format = CSV
Data Separator = ,
Escape Sign = ;
http://help.sap.com/saphelp_nw2004s/helpdata/en/43/01ed2fe3811a77e10000000a422035/content.htm
http://help.sap.com/saphelp_nw2004s/helpdata/en/80/1a6567e07211d2acb80000e829fbfe/content.htm
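If you would rather fix the dates outside Excel, here is a small hedged Java sketch; it assumes the source dates look like "6/1/2009" (a hypothetical layout; change the input pattern to match your file) and rewrites them into the YYYYMMDD form that 0DATE expects:

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class BwDate {
    // Hypothetical input pattern; adjust "M/d/yyyy" to your source file's layout.
    private static final DateTimeFormatter IN  = DateTimeFormatter.ofPattern("M/d/yyyy");
    private static final DateTimeFormatter OUT = DateTimeFormatter.ofPattern("yyyyMMdd");

    // Convert e.g. "6/1/2009" to "20090601", the internal form 0DATE expects.
    static String toBwDate(String sourceDate) {
        return LocalDate.parse(sourceDate, IN).format(OUT);
    }
}
```

A side benefit: an unparsable cell (such as the ',,,,,,,,' rows above) throws a DateTimeParseException, which makes this a quick way to locate the broken records.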
Thanks
Reddy -
Problem while reading XML file from application server (AL11)
Hi Experts,
I am facing a problem while reading an XML file from the application server using OPEN DATASET.
OPEN DATASET v_dsn IN BINARY MODE FOR INPUT.
IF sy-subrc <> 0.
EXIT.
ENDIF.
DO.
READ DATASET v_dsn INTO v_rec.
IF sy-subrc <> 0. " end of file reached
EXIT.
ENDIF.
" ... process v_rec here ...
ENDDO.
CLOSE DATASET v_dsn.
The XML file contains the details from an IDoc number. The expected output is an XML file giving all the segment details in a single page, sent to the user in Lotus Notes as an attachment. But in the present output, after opening the attachment I get a single XML file which contains most of the segments, but at the bottom it gives the error below.
- <E1EDT13 SEGMENT="1">
<QUALF>001</QUALF>
<NTANF>20110803</NTANF>
<NTANZ>080000</NTANZ>
<NTEND>20110803<The XML page cannot be displayed
Cannot view XML input using XSL style sheet. Please correct the error and then click the Refresh button, or try again later.
Invalid at the top level of the document. Error processing resource 'file:///C:/TEMP/notesD52F4D/SHPORD_0080005842.xml'.
/SPAN></NTEND>
<NTENZ>000000</NTENZ>
For all the XML files it gives the error in the bottom part, but once we open the source code and save it without changing anything, the file shows the XML without any error.
Could anyone help to solve this issue?
Hi Oliver,
Thanks for your reply.
See the latest output:
- <E1EDT13 SEGMENT="1">
<QUALF>003</QUALF>
<NTANF>20110803</NTANF>
<NTANZ>080000</NTANZ>
<NTEND>20110803</NTEND>
<NTENZ>000000</NTENZ>
<ISDD>00000000</ISDD>
<ISDZ>000000</ISDZ>
<IEDD>00000000</IEDD>
<IEDZ>000000</IEDZ>
</E1EDT13>
- <E1EDT13 SEGMENT="1">
<QUALF>001</QUALF>
<NTANF>20110803</NTANF>
<NTANZ>080000</NTANZ>
<NTEND>20110803<The XML page cannot be displayed
Cannot view XML input using XSL style sheet. Please correct the error and then click the Refresh button, or try again later.
Invalid at the top level of the document. Error processing resource 'file:///C:/TEMP/notesD52F4D/~1922011.xml'.
/SPAN></NTEND>
<NTENZ>000000</NTENZ>
The E1EDT13 segment with <QUALF>003 and the one
with <QUALF>001 have almost the same segment data, but E1EDT13 with <QUALF>003 populates all its segment data
properly, whereas E1EDT13 with <QUALF>001 breaks off in between. -
Ignoring last 2 lines while reading the file
Hi All,
I have a file structure as mentioned below :
ab
ab
ab
ab
=======
=
While reading a file, I need to ignore the last 2 lines. How can I achieve this using FCC parameters?
Regards
Vinay P.
Hi,
I am not aware of any parameter in FCC to ignore the last lines, but a workaround can be:
You may create one structure to read the last 2 lines, depending on the file structure, and ignore it in the mapping (map all records except this structure).
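If the file can be preprocessed before FCC (for example in a Java mapping or adapter module — named here only as an illustration, not a confirmed setup), the trailer can be dropped with a constant-memory two-line lookahead buffer:

```java
import java.io.*;
import java.util.*;

public class DropTrailer {
    // Stream a file but suppress its last two (trailer) lines by holding a
    // two-line lookahead buffer; a line is only emitted once two more lines
    // have been seen after it, so the final two lines are never emitted.
    static List<String> withoutLastTwo(BufferedReader in) throws IOException {
        Deque<String> buf = new ArrayDeque<>(2);
        List<String> out = new ArrayList<>();
        String line;
        while ((line = in.readLine()) != null) {
            buf.addLast(line);
            if (buf.size() > 2) out.add(buf.removeFirst());
        }
        return out;
    }
}
```

Memory use stays constant regardless of file size, since only two lines are buffered at a time.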
Regards,
Beena. -
How to read a flat file and convert it to XML through OSB
Hi,
Can somebody help with how to read a flat file and convert it to XML in OSB?
I appreciate your help.
Thanks & Regards ,
Siva K Divi
If you're using OEPE with the OSB plugin (it will be installed when you install OSB locally), then in your OSB project > right-mouse-click > New > MFL.
That's it.
Maybe you're trying to create it within an OEPE installation which doesn't have the OSB plugin? -
Error while loading flat file into DSO
Hi
I am loading data from a flat file into a DSO. My fields in the flat file are Trans_Dt, (CHAR) Particulars (CHAR), Card_Name, (CHAR) Exps_Type, (CHAR)
Debit_Amount,Credit_Amount,***._Amt,Open_Bal_Check_Acnt, (CURR)
0Currency (CHAR)
In the Proposal tab, apart from the above-mentioned fields, 3 additional fields (field 10, field 11, and field 12) have appeared. How did these 3 additional fields come in when I don't have any additional fields in my flat file? I've deleted these extra 3 fields, though.
When I activate the DataSource it gets activated, but then I get the message 'Data structures were changed. Start transactions before hand'. What does this message mean?
When I hit the 'Read preview data' button it doesn't show me any data and gives me the error: Missing reference to currency field / unit field for the fields Debit_Amount, Credit_Amount, ***._Amt, Open_Bal_Check_Acnt
How do I create a reference field for the above-mentioned fields?
Earlier I didn't have the 0Currency field in the flat file. But in my DSO, while creating the key figures, the 0Currency field also got created by default, which is quite obvious. Now while activating the transformations I got the message 'No source field for the field 0Currency'. Hence I had to create a new field in my flat file called 0Currency and fill it with USD in all rows.
Please help me in loading this flat file into the DSO.
Thank you.
TR.
Hi guys,
Thanks a lot for your answers. With your help I could see the data in 'Read preview data' and schedule the load. I did use all the InfoObjects in the InfoObjects column of the DataSource to load the flat file.
The data is in the PSA successfully without any issues, but when I executed the DTP it failed with errors.
Earlier there was no mapping from the Currency field in the source to all the key figure fields in the target transformation; the mapping was only from Currency to 0CURRENCY, but the transformation still got activated. As per your advice I mapped the Currency field to the remaining key figure fields, but then I get the error
'Source parameter CURRENCY is not being used'
Why is that so?
list of Errors after executing the DTP:
1. 'Record filtered because records with the same key contain errors'
Message:
Diagnosis: The data record was filtered out because data records with the same key have already been filtered out in the current step for other reasons, and the current update is non-commutative (for example, MOVE). This means that data records cannot be exchanged on the basis of the semantic key.
System Response: The data record is filtered out; further processing is performed in accordance with the settings chosen for error handling.
Procedure: Check these data records and data records with the same key for errors and then update them.
Procedure for System administration
Can you please explain this error and how should I fix this error.
2. Exception input_not_numeric; see long text - ID RSTRAN
Diagnosis: An exception input_not_numeric was raised while executing function module RST_TOBJ_TO_DERIVED_TOBJ.
System Response
Processing the corresponding record has been terminated.
Procedure
To analyse the cause, set a break point in the program of the transformation at the call point of function module RST_TOBJ_TO_DERIVED_TOBJ. Simulate the data transfer process to investigate the cause.
Procedure for System Administration
What does this error mean? How do I set a breakpoint in the program to fix this error in order to load the data?
What does 'Procedure for System Administration' mean?
Please advise.
Thank you.
TR. -
Problem Reading Flat File from Application server
Hi All,
I want to upload a flat file which has a line length of 3000.
While uploading it to the application server, only up to length 555 gets uploaded.
I am using the code:
DATA: BEGIN OF TB_DATA OCCURS 0,
LINE(3000),
END OF TB_DATA.
*----Uploading the flat file from the Local drive using FM "UPLOAD".
OPEN DATASET TB_FILENAM-DATA FOR OUTPUT IN TEXT MODE." ENCODING DEFAULT.
IF SY-SUBRC NE 0.
WRITE:/ 'Unable to open file:', TB_FILENAM-DATA.
ELSE.
LOOP AT TB_DATA.
TRANSFER TB_DATA TO TB_FILENAM-DATA.
ENDLOOP.
ENDIF.
CLOSE DATASET TB_FILENAM-DATA.
What could be the problem?
Could it be due to any Configuration Problem?
Waiting for your replies.
Thanks and Regards.
Suki
Your code looks OK, but you may have trouble displaying the full width. Try:
WRITE: /001 TB_FILENAM-DATA+555.
(And don't forget to append your ITAB after each read.)
Rob -
To read flat file from a unix server
We need to read a flat file from a Unix server, where our database is located.
The location gets created correctly.
But while we are trying to import files from the location in Design Center, we get the error 'directory does not exist', although the directory has all the permissions.
Can someone please suggest how we should create the location so that it can read the files.
Please reply ASAP...
We have started Design Center on a local machine (a Windows machine) with the user as repository owner of the server.
In Design Center we cannot sample the file until we import it; can you please tell us how to sample the file without importing it?
Also, a location pointing to the server location gets easily created in Design Center and the file module points to that location only, but when we try to import the file through that location, it says the directory does not exist, although the Oracle user has all the read/write permissions on the directory...
Please help! -
Error while reading raw file created in previous task using raw file destination
I am reading a flat file and creating a raw file using a raw file destination. The path of the raw file is in a variable. Now I am reading the raw file using the raw file source component. I am able to execute if the .raw file is available. But
when we deploy into production the .raw file is created at runtime, so the package fails while evaluating the variable which holds the path. Since the .raw file is not available, I am not able to proceed further. It's in SSIS 2008
R2.
Error at Load Data [Raw File Source [401]]: File "c:\test123.raw" cannot be opened for reading. Error may occur when there are no privileges or the file is not found. Exact cause is reported in previous error message.
Error at Load Data [SSIS.Pipeline]: component "Raw File Source" (401) failed validation and returned error code 0x80004005.
Error at Load Data [SSIS.Pipeline]: One or more component failed validation.
Error at Load Data : There were errors during task validation.
I am also using a raw file destination with a variable.
I have set DelayValidation = true on both the DataFlow task and even the Sequence Container.
I get the same error when I run the entire SSIS package; however,
when I run the individual container or individual task, it runs without an error.
Also, interestingly, the error does not show the same path as the variable.
Warning: The system cannot find the file specified.
Error: File "C:\Users\MyName\AppData\Local\Temp\GUIDNumber\\RawFileName" cannot be opened for reading. Error may occur when there are no privileges or the file is not found. Exact cause is reported in previous error message.
The variable is "C:\Temp\ProjectName\RawFileName"
I have other RawFile sources in this same project, but only this one file is giving me grief.
Any other suggestions? Is this a bug?
Have you set an expression for the connection string property of the raw file? Is it based on a variable/expression or configuration? If yes, check the value of the variable/expression or configuration item at runtime by putting a breakpoint in the pre-execute event of the task,
and make sure the path value it's getting is correct. It may be that the path gets a different value at runtime due to the expression/configuration set for it.
Visakh -
Problem while reading the file from FTP server
Hi Friends,
I have a problem while fetching files from an FTP server.
I used the FTP_CONNECT and FTP_COMMAND function modules. I am able to put files onto the FTP server,
but I can't pick up files from the FTP server.
If anyone has faced similar issues, kindly let me know.
Thanks
Gowrishankar
Hi,
try this way..
For reading the file using FTP you need to use a different Unix command.
Prabhuda -
Problem While reading a file in text mode from Unix in ECC 5.0
Hi Experts,
I am working on a Unicode upgrade project for ECC 5.0.
Here I have a problem reading a file format which works successfully in 4.6 but not in ECC 5.0.
My file format was as follows:
*4 000001862004060300000010###/#######L##########G/##########G/########
It was successfully converted into the corresponding field values in 4.6:
*4
00000186
2004
06
03
00000010
25
0
4
0
54.75
0
54.75
0.00
While in ECC 5.0 I get a problem during conversion of the above line:
*4 000001862004060300000010###/#######L##########G/##########G/########
the values come in as the same # characters instead of being converted.
I have used the following statement to open and read dataset.
OPEN DATASET i_dsn IN LEGACY TEXT MODE FOR INPUT.
READ DATASET i_dsn INTO pos_rec.
Thanks for your help.
Regards,
Gopinath Addepalli.
Hi,
You might be facing this problem because of Unicode. While opening or reading the file there is an addition called ENCODING; use that option and specify the code page you want. Then the problem may be solved.
Thanks & Regards.
Harish. -
Error while reading a file from server
Mine is a 9i database and an 11.5.10.2 Oracle Apps server.
I am trying to read a PDF file from the server; I used the code below:
l_bfile:=bfilename('\u05\app\applmgr\11i\oraclecomn\admin\out\jamuna_server','o7742576.out');
if DBMS_LOB.FILEEXISTS(l_bfile) = 1 then
dbms_output.put_line( 'Exists!');
dbms_output.put_line(dbms_lob.getlength(l_bfile));
else
dbms_output.put_line( 'Not Exists!');
end if;
I used forward slashes (/) instead of backslashes (\) too, and I also tried dbms_output.put_line(to_char(dbms_lob.getlength(l_bfile))).
While using the DBMS_LOB package I got the error below. I am a bit worried since I have created the directory and gave sufficient privileges too.
Error report:
ORA-00604: error occurred at recursive SQL level 1
ORA-01460: unimplemented or unreasonable conversion requested
ORA-06512: at "SYS.DBMS_LOB", line 485
ORA-06512: at line 24
I am sure bfilename('\u05\app\applmgr\11i\oraclecomn\admin\out\jamuna_server','o7742576.out') gives a BFILE which is empty.
Can you please suggest what I can do to move further? What kind of privileges are required to point to and read the file from the server?
Please help me with this.
Thanks in advance
I suspect the problem is that you are not using a directory object.
Here is the formal description of bfilename
http://download.oracle.com/docs/cd/B10501_01/server.920/a96540/functions12a.htm#SQLRF00610
Sybrand Bakker
Senior Oracle DBA -
Error while loading Flat file to the table (ORA-00936: missing expression)
Hi Gurus,
I am receiving the following error while trying to load a flat file into the database:
ODI-1228: Task test_file_load (Integration) fails on the target ORACLE connection DEMO_STAGE.
Caused By: java.sql.SQLSyntaxErrorException: ORA-00936: missing expression
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:457)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:405)
at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:889)
at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:476)
at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:204)
at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:540)
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:217)
at oracle.jdbc.driver.T4CPreparedStatement.executeForRows(T4CPreparedStatement.java:1079)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1466)
at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3752)
at oracle.jdbc.driver.OraclePreparedStatement.execute(OraclePreparedStatement.java:3937)
at oracle.jdbc.driver.OraclePreparedStatementWrapper.execute(OraclePreparedStatementWrapper.java:1535)
at oracle.odi.runtime.agent.execution.sql.SQLCommand.execute(SQLCommand.java:163)
at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:102)
at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:1)
at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:537)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
at java.lang.Thread.run(Thread.java:662)
The file which I have tried to load is SRC_SALES_PERSON, and the table structure is:
CREATE table "TRG_SALES_PERSON"(
"SALES_PERSON_ID" NUMBER(8,0) NOT NULL,
"FIRST_NAME" VARCHAR2(80),
"LAST_NAME" VARCHAR2(80),
"DATE_HIRED" VARCHAR2(80),
"DATE_UPDATED" DATE NOT NULL)
Knowledge module used are
LKM File to SQL
IKM SQL Incremental Update
We are using ODI 11g R2...
Thanks, and I really appreciate any help in this regard.
Hi there,
I am facing the same issue while loading data from SRC_SALES_PERSON (flat file) to TRG_CUSTOMER.
I don't see any errors in the steps; however, the data is not loaded in the end. Here are the SQL commands:
**On source**
select ID C11_ID,
LASTNAME C9_LASTNAME
from TABLE
/*$$SNPS_START_KEYSNP$CRDWG_TABLESNP$CRTABLE_NAME=SRC_SALES_PERSONSNP$CRLOAD_FILE=D:\Pratima\Softwares\ODI\ofm_odi_companion_generic_11.1.1.5.1_disk1_1of1[1]\demo\oracledi-demo\oracledi\demo\file/SRC_SALES_PERSON.txtSNP$CRFILE_FORMAT=FSNP$CRFILE_SEP_FIELD=0x0009SNP$CRFILE_SEP_LINE=0x000D0x000ASNP$CRFILE_FIRST_ROW=0SNP$CRFILE_ENC_FIELD=SNP$CRFILE_DEC_SEP=SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=IDSNP$CRTYPE_NAME=STRINGSNP$CRORDER=1SNP$CRLINE_OFFSET=1SNP$CRLENGTH=11SNP$CRPRECISION=11SNP$CRACTION_ON_ERROR=0SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=FIRSTNAMESNP$CRTYPE_NAME=STRINGSNP$CRLINE_OFFSET=12SNP$CRLENGTH=50SNP$CRPRECISION=50SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=LASTNAMESNP$CRTYPE_NAME=STRINGSNP$CRLINE_OFFSET=62SNP$CRLENGTH=50SNP$CRPRECISION=50SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=DATE1SNP$CRTYPE_NAME=STRINGSNP$CRLINE_OFFSET=112SNP$CRLENGTH=20SNP$CRPRECISION=20SNP$CR$$SNPS_END_KEY*/
On Target
insert into STAGING.C$_0TRG_CUSTOMER
C11_ID,
C9_LASTNAME
values
:C11_ID,
:C9_LASTNAME
The actual code at the source fails; however, the step shows green.
Thanks in Advance,
Pratima