CSV file no data found

Hi all
I am new to APEX development; please help with the issue below.
I have two report pages (interactive reports). Let's say:
Page 1: Department page (list of departments), linked to Page 2
Page 2: List of employees for the selected department - page item (P_DEPARTMENT)
This report's source is (select * from emp where department_id = :P_DEPARTMENT)
Now when I click 'Download' and select the 'Excel' format, I can see data being downloaded for Page 1,
but not for Page 2 - when I download from that page, I see blank data in the Excel sheet.
I have seen a couple of old threads on this same topic:
Re: Strange caching error in exporting a report to csv (AppEx v2.0)
http://apex.oracle.com/pls/otn/f?p=31517:1:4124351071686
The suggestion is: "compute your items on load - before header. This means to set the items to a kind
of a default value they should have at the start"
I was not very clear on the above statement; can anyone suggest how to proceed to solve this issue,
i.e. how to 'compute your items on load - before header'? I am trying to export the data on Page 2 into an Excel file.
Thanks

The threads you found are regarding session state, typically in Classic Reports - and if this is a parameter coming from another page and you already see data in your IR on the page, then you have a different problem.
And your other problem may be trickier to track down - a search on "oracle apex blank interactive report csv" found these:
download IR to excel produces an empty report -- automatic CSV encoding - what's your language?
blank csv from interactive report -- a process removing data - unlikely in your case, but see how different these issues are?
Can you set up a test case on apex.oracle.com?
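For reference, "compute your items on load - before header" translates into something like the following minimal sketch - a computation or PL/SQL process on Page 2 with its point set to Before Header (P_DEPARTMENT is the item from your post; the default of 10 is only a placeholder):

begin
  -- seed session state so the download request also sees a value
  if :P_DEPARTMENT is null then
    :P_DEPARTMENT := 10;  -- placeholder default; use whatever makes sense
  end if;
end;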
Scott
blog: grassroots-oracle.com
twitter: @swesley_perth

Similar Messages

  • How to search a .csv file for data using its timestamp, then import to labview

    Hi, I'm currently obtaining density, viscosity and temperature data from an instrument, adding a timestamp and writing it to a .csv file which I can view in Excel. This works fine (see attached code) but what I need to do now is to search that csv file for data which was obtained at a certain time, import the temperature, density & viscosity values at this time back into Labview to do some calculations with them, while the data acquisition process is still ongoing.
    I've found various examples on how to import an entire csv file into labview, but none on how to extract data at a specific time. Also, whenever I try to do anything with the .csv file while my data acquisition VI is running, I receive error messages (presumably because I'm trying to write to and import data from the .csv file simultaneously). Is there some way around this, maybe using case structures?
    If you need to know my skill level, I've been using Labview for a few weeks and prior to that have basically no experience of writing code, so any help would be great. Thanks!
    Attachments:
    Lemis VDC-30 read registers MODBUS v5.vi ‏56 KB

    It sounds as if you are going about this a little backwards, writing to a data file and then extracting from the file, but it's the weekend so I can't think of an improved way to do it at the moment.
    Searching for a specific time with those specific values is quite easy; or, if you wanted to select any time, you could interpolate the values to find any value that you want. (This is where the contiguous measurement comes in: as you have readings at discrete times, you will have to interpolate the values if you want to get the 'measured value' at a time point that is not exactly one of your measured points.)
    If you can extract the TDMS time column and the T, D & V values, then simply thresholding and interpolating each of your arrays/data sets should allow readings at your desired times.
    Attachments:
    Interpolate.png ‏301 KB

  • Import Comments data and Dimension Members from csv file via Data Manager

    Dear Experts,
    I have two questions regarding the data manager.
    Q1.Is it possible to import "Comments" from the csv file via Data Manager?
    We'd like to import the amount with "Comments".
    My image of the csv file is like below:
    ACCOUNT,CATEGORY,TIME,ENTITY,INPUTCURRENCY,AMOUNT,COMMENTS
    1100000,ACTUAL,2010/06,LC,30000,This is comment
    Q2.Is it possible to import the dimension "members" from the csv file via Data Manager?
    We have a user-defined dimension named "Project"
    and would like to import the members, instead of maintaining them in BPC administration manually.
    I found an online help information which says "Import Master Data from a Data File Example",
    but I could not find any relevant sample package for this.
    (I tried to import the members by using "Import" package, but it failed...)
    reference:http://help.sap.com/saphelp_bpc75/helpdata/en/86/8b1bfc12c94fb0b585cca70d6f1b61/content.htm
    Thanks in advance for your help.
    Fumi

    Hi Fumi,
    In this case, I would suggest you create a customized SSIS package which will fill in the "Comment<APP>" table according to the csv file you have. I do not know any standard package that allows you to import comments the way you would like...
    Best Regards,
    Patrick

  • Performance issue with big CSV files as data source

    Hi,
    We are creating crystal reports for a large banking corporation with CSV files as the data source. For some reports, we need to join 2 csv files. The problem we have now is that when the 2 csv files are very large (both >200 MB), the performance is very bad and it takes an hour or so to refresh the data in Crystal Reports designer. The same is the case for either CR 11.5 or CR 2008.
    And my question is: is there any way to improve performance in such situations? For example, can we create an index on the csv files? If you have ever created reports connecting to CSV, your suggestions will be highly appreciated.
    Thanks,
    Ray

    Certainly a reasonable concern...
    The question at this point is, How are the reports going to be used and deployed once they are in production?
    I'd look at it from that direction.
    For example... They may be able to dump the data directly to another database on a separate server that would insulate the main enterprise server. This would allow the main server to run the necessary queries during off peak hours and would isolate any reporting activity to a "reporting database".
    This would also keep the data secure and encrypted (it would continue to enjoy the security provided by an RDBMS). Text & csv files can be copied, emailed, altered & deleted by anyone who sees them. Placing them in encrypted .zip folders prevents them from being read by external applications.
    Hope you liked the sales pitch I wrote for you to give to the client... =^)
    If all else fails and you're stuck using the csv files, at least see if they can get it all into one file. Joining the 2 files is killing you performance-wise... more so than using 1 massive file.
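    To make the "reporting database" idea concrete: if that database were Oracle, staging one of the CSV files might look something like this sketch (the directory, table and column names are all made up for illustration):
    create directory csv_dir as '/data/csv';
    create table stg_transactions (
      txn_id     number,
      account_no varchar2(20),
      amount     number
    )
    organization external (
      type oracle_loader
      default directory csv_dir
      access parameters (records delimited by newline
                         fields terminated by ',')
      location ('transactions.csv')
    );
    -- materialize and index the join column so the report hits a real table
    create table rpt_transactions as select * from stg_transactions;
    create index rpt_transactions_ix on rpt_transactions (account_no);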
    Jason

  • 2.5 GB CSV file as data source for Crystal report

    Hi Experts,
    I was asked to create a crystal report using a CSV file as the data source (the file is pretty huge - 2.4 GB). Could you help me with any doc that explains the steps, mainly the data connectivity?
    The objective is to create a Crystal Report using that csv file as the data source, save the report as .rpt with the data, and send the results to the customer to be read with Crystal Reports Viewer, or save the results to PDF.
    Please help and suggest steps, as I am new to crystal reports and CSV as a source.
    BR, Nanda Kishore

    Nanda,
    The issue of having some records with a comma and some with a semicolon will need to be resolved before you can do an import. Assuming that there are no semicolons in any of the text values of the report, you could do a "Find & Replace" to convert the semicolons to commas.
    If find & replace isn't an option, you'll need to get the files separately.
    I've never used the Import Export Wizard myself. I've always used the BULK INSERT command.
    It would look something like this...
    BULK INSERT SQLServerTableName
    FROM 'c:\My_CSV_File.csv'
    WITH (FIELDTERMINATOR = ',')
    This of course implies that your table has the same columns, in the same order as the csv files and that each column is the correct data type to accept the incoming data.
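    If your CSV file has a header row, the same command with two more standard options would skip it - a sketch using the table and path from the example above:
    BULK INSERT SQLServerTableName
    FROM 'c:\My_CSV_File.csv'
    WITH (FIELDTERMINATOR = ',',
          ROWTERMINATOR   = '\n',
          FIRSTROW        = 2)  -- row 1 is the header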
    If you continue to have issues getting your data into SQL Server Express, please post in one of these two forums
    Transact-SQL: http://social.msdn.microsoft.com/Forums/en-US/transactsql/threads
    SQL Server Express: http://social.msdn.microsoft.com/Forums/en-US/sqlexpress/threads
    The Transact-SQL forum has some VERY knowledgeable people (including MVPs and book authors) posting answers.
    I've never posted to the SQL Server Express forum, but I'm sure they can troubleshoot your issues with the Import Export Wizard.
    If you post in one of them, please copy the post link back to this thread so I can continue to help.
    Jason

  • Data Filter on CSV file using Data Synchronization

    Got an error when I used a data filter on a CSV file in a Data Synchronization task. Filter condition: BILLINGSTATE LIKE 'CA'
    TE_7002 Transformation stopped due to a fatal  error in the mapping. The expression [(BILLINGSTATE LIKE 'CA')] contains  the following errors [<<PM Parse Error>> missing operator  ... (BILLINGSTATE>>>> <<<<LIKE 'CA')].

    Hi,
    Yes, this can be done through BEx Broadcaster.
    Please follow the steps below...
    1.Open your query in BEx Analyzer
    2.Go to BEx Analysis Toolbar->Tools->BEx Broadcaster...
    3.Click "Create New Settings"->Select the "Distribution Type" as "Broadcast Email" and "Output Format"  as "CSV"
    4.Enter the Recipients Email Address under "Recipients" tab
    5.Enter the Subject and Body of the mail under "Texts" tab
    6.Save the Setting and Execute it.
    Now the query data will be attached as a CSV file and sent to the recipients through email.
    Hope this helps you.
    Rgds,
    Murali

  • CSV output  No data found

    Hello,
    My HTMLDB report displays data, but when a CSV file is generated, 'No data found' displays. Any ideas??

    Yes, the above-mentioned computation worked for me.
    Here's what I did on the page in question.
    Create a new Page Computation
    Type: Static Assignment
    Computation Point: Before Header
    Computation: ALL
    Condition Type: Value of Item in Expression 1 is NULL
    Expression 1: P8_FILTER_BY_TEAM (the name of the page item that needs the default session state set.)
    The conditional is important because if a person changes the value and submits the page, we want to hold that new value in session state and filter the result set accordingly. However, if the page item is NULL, then we're simply extending the display default value to also be the session-state default for P8_FILTER_BY_TEAM.
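    For anyone who prefers to see it as code, the same computation expressed as a Before Header PL/SQL block would look roughly like this sketch (the item name and the 'ALL' default are taken from the settings above):
    begin
      -- extend the display default into session state for the export session
      if :P8_FILTER_BY_TEAM is null then
        :P8_FILTER_BY_TEAM := 'ALL';
      end if;
    end;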

  • Export to CSV Option - no data found

    Hi
    I have an app with various report regions that work fine on-line. They typically take one or more values from select lists as input to the where clause, and have summary processing based on the first, or first and second, columns. It seems that some reports export to csv just fine, but others produce a csv file with just the 'no data found' message? Any ideas?
    cheers
    Jules

    Hi Chris
    I think I understand what happens now - i.e. the CSV export runs in a separate session from the on-line page. Your suggestion has fixed the problem - by having a button redirect/resubmit back to the same page, the csv export then picks up the right data.
    However, I do not understand why, as this seems no different from having one or more parameter fields defined as 'select list with redirect', which I do. Any further explanation appreciated.
    Jules

  • Create csv file for data in tables

    Hi All,
    I need to "export" data for about 60 tables in one of my databases to a csv file format using a "|" as a separator.
    I know I can do this using a query like:
    select col1 || '|' || col2 || '|' || col3 from table;
    Some of my tables have more than 50 columns so I'm guessing there is an easier way to do this than to construct select SQL statements for each table?
    Thanks in advance.

    I would point out that the OP did not identify the target for the files, so it could be a non-Oracle database or Excel, in which case external tables would not work, since Oracle only writes to external tables in datapump format. If the target is another Oracle database, then external tables would be an option, and in that case insert/select over a database link would be a potential alternative to using csv files.
    I use a SQL script to generate the Select statement to create my csv files.
    set echo off
    rem
    rem SQL*Plus script to create comma delimited output file from table
    rem
    rem 20000614  Mark D Powell   Automate commonly done task
    rem
    set pagesize  0
    set verify    off
    set feedback  off
    set linesize  999
    set trimspool on
    accept owner    prompt 'Enter table owner => '
    accept tblname  prompt 'Enter table name => '
    spool csv2.sql
    select 'select ' from sys.dual;
    select decode(column_id,1,column_name,
                 '||''|''||'||column_name)
    from   sys.dba_tab_columns
    where  table_name = upper('&&tblname')
    and    owner      = upper('&&owner')
    order by column_id;
    select 'from &&owner..&&tblname;'
    from   sys.dual;
    spool off
    undefine owner
    undefine tblname
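    Once csv2.sql has been generated, producing the actual data file is just a second spool around the generated statement - something like this (the output file name is only an example):
    spool mytable.csv
    @csv2.sql
    spool off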
    HTH -- Mark D Powell --

  • Why does Numbers 3.0 convert dates into integers when opening csv files?

    I want to open a csv-file with a date.
    The date is in German notation: 28.10.2013. The value in the cell is 40113.
    This problem occurs only in Numbers 3.0;
    if I open the same file with Numbers 2.3 it is OK.
    How can I change it?
    The system preferences for date and time handling are OK.

    2. Even though I have set the data type to text within EPMA, once I deploy the application and check it with the EAS console, the data type is NUMERIC - NOT TEXT anymore. This blows my mind. I don't know why this happens.
    It's common... a text value which has been entered in Planning as text shows up as a number... There are certain tables in Planning which play a role while fetching such values....
    Cheers!
    Sh!va

  • How to change csv file to dat file

    Hi, 
    I want to ask you about the data that I got.
    Actually, the data is in a csv file. So, can I change the file directly to a dat file, just by saving it as c:\Trace.dat?
    Is that possible?
    Need your help.
    Thank you.
    Attachments:
    csv.JPG ‏108 KB

    NurHidayah wrote:
    Actually, this format is for the MATLAB software. So, I want this dat format because I want to simplify importing my data directly into the software.
    And what is the format that is defined?  Or are you referring to MAT files, which are binary files specific to Matlab?
    There are only two ways to tell somebody thanks: Kudos and Marked Solutions
    Unofficial Forum Rules and Guidelines

  • 2008R2 FOREACH csv file - move data to db - move file to folder weirdness

    Greetings everyone,
    I'm facing a weird problem. I'm retrieving files from an ftp server and saving them in a local 'to process' folder. Then I use a foreach file iterator to process those files and save the data in the database through a data flow task. When this operation finishes, I move the file to one of 2 possible folders, Processed or Error, depending on the result of the data flow task.
    To ensure my loop continues if an error occurs, I've placed an event handler on the OnError action of the DFT. There, in the system variables, I set the value of PROPAGATE to false. This way there isn't an error escalation that would fire the package OnError event handler. All works well when all my files are correct regarding the content.
    For testing purposes I've created a dummy data file that fires a truncation error. When I run the package and the iterator arrives at the BAD dummy data file, the file gets placed in the error folder as expected and I get the correct error message (truncation error). However, when the next file, which is correct, is processed in the DFT, it also produces an error:
    Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "DS_STG" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
    I don't understand this. It seems like the previous error has a corrupting impact on the DFT. Is there a property that I have to set here? I've tried playing with the MaximumErrorCount and tried to run it with a configuration file to persist the connection manager information, but the behaviour still persists.
    So in summary: the iteration continues until the end, but after one bad file is processed in the DFT, all the next good files get the connection manager error...
    I can think of ways to work around this issue, but I would think that this should work as it is, no? :)
    Thanks for the answers in advance.

    Hi Visakh,
    I specify the folder which holds the csv files. Then I assign each file path to a variable.
    Then I use this variable (path) as the connection string in the DFT flat file source.
    I created a workaround which I don't like, but it seems to do the job. I kept the original foreach file loop to determine which files are valid, BUT I no longer do the data insert in this DFT. I deleted the ole db destination so it generates an error only if the data doesn't come through in the DFT.
    Then I remove the bad files from the 'to process' folder. I copy the foreach iterator with all the components, but now I add the ole db destination in the data flow task for the data insert. At this point there are only correct files in the 'to process' folder.
    This works, but it isn't 'pretty' :)
    Do you have an idea what could be wrong? It seems one bad file corrupts the destination connection.
    When you say bad file, is it the metadata which is corrupted?
    Please Mark This As Answer if it solved your issue
    Please Mark This As Helpful if it helps to solve your issue
    Visakh
    My MSDN Page
    My Personal Blog
    My Facebook Page

  • Can I use Office 2007 csv file for data load?

    Hi all,
    I see some special characters while looking at the data after load.
    I am using the latest version of Microsoft Office.
    Does SAP BW support csv files saved in the 2007 version?
    Thanks and Regards,
    Ravi.

    Hi Ravi,
    This is another error.
    Error message from the source system
    Diagnosis
    An error occurred in the source system.
    System response
    Caller 09 contains an error message.
    Further analysis:
    The error occurred in Extractor .
    Refer to the error message.
    Procedure
    How you remove the error depends on the error message.
    Note
    If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
    Error Message:
    Error 1 when loading external data
    Error when opening the data file
    Errors in source system.
    Step-by-step analysis:
    Data request sent off?
    Data selection successfully started?
    Data selection successfully finished?
    Processing error in source system reported?
    Processing error in warehouse reported?
    Could you please guide me how to overcome these? Thanks.

  • Importing CSV file with Data Merge Fails

    Specs
    See pasted text from CSV at http://pastebin.com/mymhugpN
    I am using InDesign CS6 (8.0.1)
    I created the CSV by downloading it from a Google Spreadsheet as a CSV. I confirm with the Terminal that the character encoding is utf-8 using the file command.
    Problem detailed
    I am trying to import a CSV file (utf-8) with Data Merge via the Select Data Source... command with Show Import Options checked. When viewing the Data Source Import Options dialog, I set the following options—Delimiter:Comma, Encoding:Unicode, Platform:Macintosh. I leave Preserve Spaces in Data Source unchecked. It fails to import any variables and produces no error message. I have tried other CSV files as well (created TextEdit, Espresso, etc.) and it seems that InDesign will not import any files if Unicode is specified as the encoding, no matter which other options are specified.
    Can anyone else confirm this?
    Importing as ASCII works, but obviously does not display my content correctly.

    Mike is having some trouble posting in this thread (and I am too), but he sent me a PM with what he wanted to say:
    OK. I think I might have a positive answer for you.
    I was getting lost in the upper ASCII characters you showed. In your test file I never could see any--a case of not seeing the trees for the forest.
    Your quote marks are getting dropped in your test file. Now, this may or may not affect other factors but it does in some further testing. I believe ID has an issue with dropping quote marks even in a plain ASCII file if the marks are at the beginning of a sentence and the file is tab delimited. Call it a bug.
    Because of all the commas and quote marks in your simple file, I think you should be exporting from Google Docs' spreadsheet as a tab-delimited file. This exported file has to be opened in a text editor capable of saving it out as a UTF-16 BE (Big Endian) type of file.
    Also, I think you are going to have to use proper quote marks throughout, or change them in the exported tab-delimited file. Best to have a correct source, though.
    Here is your sample ZIPped up. I think it works properly. But then again, I think I might be bleary-eyed by now.
    http://www.wenzloffandsons.com/temp/merge_psalms_utf-16.zip
    Take care, Mike

  • Negative Numbers considered as VARCHAR while uploading CSV file under Data Load in APEX

    I am trying to upload a CSV file which contains negative numbers. These negative numbers are being treated as VARCHAR2 during the upload, and if I change the column type to NUMBER, the upload fails.
    Any solutions to the problem will be highly appreciated.

    select * from nls_database_parameters
    where parameter = 'NLS_NUMERIC_CHARACTERS'
    shows you which characters your database believes represent the decimal place and the thousands separator. So, if your database expects 1,234.123 and you present a 'number' as 1 234,123 it will complain.
    Also, in your case... your comma-separated-values contain commas? Or are the individual fields enclosed with quotes?
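    To illustrate the first point: you can check the setting and, if needed, tell Oracle explicitly which characters to use when converting - a sketch with a European-style value (the format mask and value are just examples):
    select value
    from   nls_database_parameters
    where  parameter = 'NLS_NUMERIC_CHARACTERS';
    -- parse '-1234,56' treating ',' as the decimal character and '.' as the group separator
    select to_number('-1234,56', '9999999D99',
                     'nls_numeric_characters='',.''')
    from   dual;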
