Region Report reloading with wrong data

Hi all,
I'm having an odd problem with APEX that I'm hoping somebody could help with.
I have a region report containing 30 text fields, VALUE_1 to VALUE_30. The report has several validation rules, and when it is submitted with no errors it works fine. However, when any of the validations picks up an error, the page reloads as expected with the inline error text showing fine... but the text in the VALUE_7 field has been copied into the VALUE_17 input field (so both fields display the same text). The data in all of the remaining cells displays fine. I've checked all of the obvious report and column attributes but can't see anything wrong, and I'm satisfied that the copied data in the VALUE_17 field is not coming from the database.
Can anybody point me in the direction of things to try? Can a report become corrupted, or are there known report issues with my version of APEX (v4.0.2.00.07)?
A large amount of Googling hasn't shown any similar problems.
Many thanks,
~H.
PS: Apologies for the username, the Preferences option refuses to admit that I've changed it!

Hi Hinden,
Is your report based on a table or a collection? If it is based on a table, the report would normally revert to the table's original values (as they were when you first opened the page) when a validation fails. If it is based on a collection, then you need to check the code that populates the collection; my guess is you insert the wrong values there.
Regards,
Joni

Similar Messages

  • iPhoto imports photos with wrong dates even if the dates are fine on the camera

    Hi!
    When I import photos with iPhoto, it sometimes imports them with wrong dates, even though the dates are fine on the camera. It puts dates such as 2032. Does anyone know how I can fix that? As far as I know there is no way to change the dates of the photos.
    Thanks!

    Well, that is very confusing, since if the date is correct on the camera it will be correct in iPhoto.
    And as to:
    As far as I know there is no way to change dates of the photos.
    Try looking through your iPhoto menus for two commands: Adjust Date and Time, and Batch Change Date and Time. Adjust is used to correct incorrect dates, such as a wrong camera setting; Batch Change is for missing dates, as with scans.
    LN

  • How to update a table (CUSTOMER) on one server (Report Server) with the data from the same table (CUSTOMER) on another server (Transaction Server)?

    I had an interview question:
    How do you update a table (Customer) on one server, e.g. a Report Server, with the data from the same table (Customer) on another server, e.g. a Transaction Server?
    What are the setup steps so that insert, update and delete operations take place across the servers?
    It would be great if someone could explain this process in detail for MS SQL Server 2008 R2.
    Also, please describe whether it would be different for SQL Server 2012, and if so, what the steps are.


  • Urgent: BEx report showing the wrong data

    Hi Experts,
    The data in my cube is correct, but my BEx report is showing the wrong data.
    For example, in the InfoCube:
    Material               Plant    Year    Units  Qty
    000000000002001032     003     2006     BAG   1,500
    But in BEx, using the same selections, it displays:
    000000000002001032     003     2006     BAG   2
    I am unable to trace this.
    Any help please?
    Anil

    Hi,
    Since a quantity (unit PC?) cannot be shown in decimals, BEx will round 1.5 off to 2 in the report.
    Use the NODIM function, i.e. NODIM(Qty), instead of Quantity.
    The NODIM function is available under Data Functions in the Formula builder.
    From the context menu of the KF structure > New Formula > NODIM(Qty).
    Alternatively, try setting Decimal Places to 2 or 3 in the key figure properties in the BEx query.
    Hope this helps.
    Message was edited by: Murali
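    To see why no data is actually lost here, a minimal sketch (Python, values taken from the example above; purely illustrative, this is not BEx internals) of how a zero-decimal display turns 1.5 into 2 while the stored value stays intact:

```python
# The cube stores 1,500 (European decimal notation for 1.5);
# only the display formatting rounds it.
stored = 1.5

displayed = f"{stored:.0f}"       # what a 0-decimal key figure shows: '2'
with_decimals = f"{stored:.3f}"   # raising Decimal Places reveals '1.500'

print(displayed, with_decimals)
```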

  • Reports generated with null data

    Hi,
    I set up a role which correctly has read access to all records under the different tabs. However, the reports are displayed with null data. Any idea?
    Thanks VK

    Hi Bobb,
    "Role-Based Can Read All Records" is set to No in my configuration. The requirement is not to allow report creation on other users' data.
    So, if I set it to Yes, then they can access all data in CRMOD and create reports. Please let me know if my understanding is correct.
    Thanks VK

  • EWA report collects the wrong data

    Hi all
    Our EWA report does not collect data correctly.
    It says:
    When we checked the log files of the archive runs, we detected that your archive strategy does not follow the SAP archive recommendations.
    In the time period from 28.04.2009 to 25.05.2009 , we noticed the following problems:
    - There was neither a successful archive run nor a successful off-line backup on Friday 22.05.2009
    - There was neither a successful archive run nor a successful off-line backup on Thursday 21.05.2009
    - There was neither a successful archive run nor a successful off-line backup on Wednesday 20.05.2009
    - There was neither a successful archive run nor a successful off-line backup on Tuesday 19.05.2009
    - There was neither a successful archive run nor a successful off-line backup on Monday 18.05.2009
    There are 5 working days without successful archive run this week.
    In fact, the backups ran smoothly for that period; I think the system read the wrong period. On 5-6 May 2009 there was a backup error, but from 7 May until now the backups have run smoothly.
    FYI: on 8 May 2009 we restarted our production system.
    saposcol and sapccmsr are already running,
    but when I run OS07N there is no data for the casapdb system, error "Partner casapdb_CAP_32 not reached (reading table)".
    Can anybody help me solve my problem?
    thanks
    regards,
    Della

    Well, for the backups, as you say...
    SAP looks for offline backups every day, and since you are not doing offline backups it flags an error in the EWA report, so you should not be too concerned about the backup error there.
    What wrong data is EWA collecting apart from this?
    Rohit

  • When importing, creates folder with wrong date

    I found a curious bug in Lightroom 4.2.
    If I have pictures taken after 23:00, Lightroom creates the import folder under the next day's date, but the dates in the files themselves are correct.
    Example:
    The file's date is 29/11/2012 23:19, but if I hover my mouse over the image in the import dialog, the date that appears is 30/11/2012 00:19.
    After you import, the image ends up in the wrong import folder, but the metadata on the image is correct.
    I noticed this before, but always put it down to a wrong date on my camera. I've just noticed that it happens with my camera, my iPhone, and any other photo taken after 23:00.
    Do you know if there is a solution for this?
    Thank you in advance,
    Manuel Azevedo

    Does your computer time match your camera time, daylight-saving differences aside?
    The reason why I am asking: the Map module assumes that these times match (rather than asking in which UTC time zone your image time stamps should be interpreted when tagging from a GPX track).
    Something similar could be happening in the Import module if you let LR auto-create destination folders based on date and time.
    I have no experience with this, as I prefer to name the new destination folders explicitly myself.
    Cornelia

  • (Please note) - Report - Bluetooth with Enhanced Data Rate Software II for Windows 7 wipes main drive

    All,
    There have been reports that upgrading the Bluetooth with Enhanced Data Rate Software II for Windows 7 to version 6.4.0.1700 causes a wipe of the main OS drive (commonly C:\)
    Version:
    6.4.0.1700
    Release Date:
    2011/05/16
    Affected units which use this version of the driver are as follows.
    Support models ThinkPad L420, L421
    ThinkPad L520
    ThinkPad T420, T420i, T420s, T420si
    ThinkPad T520, T520i
    Thinkpad W520
    ThinkPad X1
    ThinkPad X220, X220i, X220 Tablet, X220i Tablet
    ThinkPad Edge E220s
    ThinkPad Edge E420, E420s
    ThinkPad Edge E520
    The issue has been raised to engineering, and the team is currently working on it.
    ****Please do not update the Bluetooth driver until further notice****
    We are in the process of pulling the driver off the Support Site and ThinkVantage System Update.
    Main thread discussion here
    //JameZ

    All,
    Updated drivers have been released to the web team and should be published in the next couple of days.
    Will update this thread once they are live - the official solution will be to use the newer drivers.
    mark
    Mark Hopkins
    Program Manager, Lenovo Social Media (Services)

  • 10g Reports issue with XML Data Source

    Hi,
    Has anybody ever encountered an issue with an Oracle 10g report using XML as the data source? What happens is that some of the values in the XML are printed to the wrong column.
    One of the elements in our XML file is a complex type with 10 elements under it. The first five are picked up properly, but the later ones are not. Elements #6 to #9 have a minimum occurrence of 0. What happens is that when element #6 is not present but #7 is, the value of element #7 is passed on to element #6.
    The XSD and XSL files are both valid, since the reports were working when we were still using 9i. There is no hidden logic in the report which might cause this issue; the report just picks up the values from the XML and prints them to the appropriate columns.
    Any help will be greatly appreciated.

    XSD used
    <?xml version="1.0" encoding="UTF-8"?>
    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" elementFormDefault="qualified">
            <!-- trade instructions detail & trailer -->
            <xs:element name="TradeDetail">
                    <xs:complexType>
                            <xs:sequence>
                                    <xs:element ref="TradeType"/>
                                    <xs:element ref="TradeID"/>
                                    <xs:element ref="TradeDate"/>
                                    <xs:element ref="FundID"/>
                                    <xs:element ref="FundName"/>
                                    <xs:element ref="DollarValue" minOccurs="0"/>
                                    <xs:element ref="UnitValue" minOccurs="0"/>
                                    <xs:element ref="PercentageValue" minOccurs="0"/>
                                    <xs:element ref="OriginalTradeID" minOccurs="0"/>
                                    <xs:element ref="CancellationFlag"/>
                            </xs:sequence>
                    </xs:complexType>
            </xs:element>
            <xs:element name="Instruction">
                    <xs:complexType>
                            <xs:sequence minOccurs="0">
                                    <xs:element ref="TradeDetail" maxOccurs="unbounded"/>
                            </xs:sequence>
                    </xs:complexType>
            </xs:element>
            <!-- overall trade instruction message -->
            <xs:element name="InterchangeHeader">
                    <xs:complexType>
                            <xs:sequence>
                                    <xs:element ref="Instruction"/>
                            </xs:sequence>
                    </xs:complexType>
            </xs:element>
            <!-- definition of simple elements -->
            <xs:element name="FundID" type="xs:string"/>
            <xs:element name="TradeType" type="xs:string"/>
            <xs:element name="TradeID" type="xs:string"/>
            <xs:element name="TradeDate" type="xs:string"/>
            <xs:element name="FundName" type="xs:string"/>
            <xs:element name="DollarValue" type="xs:decimal"/>
            <xs:element name="UnitValue" type="xs:decimal"/>
            <xs:element name="PercentageValue" type="xs:decimal"/>
            <xs:element name="OriginalTradeID" type="xs:string"/>
            <xs:element name="CancellationFlag" type="xs:string"/>
    </xs:schema>
    XML used
    <?xml version = '1.0' encoding = 'UTF-8'?>
    <InterchangeHeader xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="TradeInstruction.xsd">
       <Instruction>
          <TradeDetail>
             <TradeType>Purchase</TradeType>
             <TradeID>M000038290</TradeID>
             <TradeDate>20061201</TradeDate>
             <FundID>ARO0011AU</FundID>
             <FundName>ABN Fund</FundName>
             <DollarValue>2111.53</DollarValue>
             <CancellationFlag>N</CancellationFlag>
          </TradeDetail>
          <TradeDetail>
             <TradeType>Redemption</TradeType>
             <TradeID>M000038292</TradeID>
             <TradeDate>20061201</TradeDate>
             <FundID>ARO0011AU</FundID>
             <FundName>AMRO Equity Fund</FundName>
             <UnitValue>104881.270200</UnitValue>
             <CancellationFlag>N</CancellationFlag>
          </TradeDetail>
          <TradeDetail>
             <TradeType>ISPurchase</TradeType>
             <TradeID>M000038312</TradeID>
             <TradeDate>20061201</TradeDate>
             <FundID>MLC0011AU</FundID>
             <FundName>Cash Fund</FundName>
             <OriginalTradeID>M000038311</OriginalTradeID>
             <CancellationFlag>N</CancellationFlag>
          </TradeDetail>
       </Instruction>
    </InterchangeHeader>
    XSLT used
    <?xml version="1.0" encoding="UTF-8"?>
    <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
            <xsl:output method="xml" version="1.0" encoding="UTF-8" indent="yes"/>
            <xsl:template match="/">
                    <InterchangeHeader>
                            <xsl:for-each select="InterchangeHeader/Instruction/TradeDetail">
                            <xsl:sort select="FundName"/>
                            <xsl:sort select="TradeDate"/>
                                    <TradeDetail>
                                            <TradeType><xsl:value-of select="TradeType"/></TradeType>
                                            <TradeID><xsl:value-of select="TradeID"/></TradeID>
                                            <TradeDate><xsl:value-of select="TradeDate"/></TradeDate>
                                            <FundID><xsl:value-of select="FundID"/></FundID>
                                            <FundName><xsl:value-of select="FundName"/></FundName>
                                            <DollarValue><xsl:value-of select="DollarValue"/></DollarValue>
                                            <UnitValue><xsl:value-of select="UnitValue"/></UnitValue>
                                            <PercentageValue><xsl:value-of select="PercentageValue"/></PercentageValue>
                                            <OriginalTradeID><xsl:value-of select="OriginalTradeID"/></OriginalTradeID>
                                            <CancellationFlag><xsl:value-of select="CancellationFlag"/></CancellationFlag>
                                    </TradeDetail>
                            </xsl:for-each>
                    </InterchangeHeader>
            </xsl:template>
    </xsl:stylesheet>
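    The column shift described above is the classic failure mode of reading optional child elements by position instead of by name. A minimal sketch of the difference (Python with the standard library's `xml.etree`, purely to illustrate the failure mode using the second TradeDetail above; this is not how Oracle Reports is implemented):

```python
import xml.etree.ElementTree as ET

# A TradeDetail where the optional DollarValue (element #6) is absent.
xml_text = """<TradeDetail>
   <TradeType>Redemption</TradeType>
   <TradeID>M000038292</TradeID>
   <TradeDate>20061201</TradeDate>
   <FundID>ARO0011AU</FundID>
   <FundName>AMRO Equity Fund</FundName>
   <UnitValue>104881.270200</UnitValue>
   <CancellationFlag>N</CancellationFlag>
</TradeDetail>"""

detail = ET.fromstring(xml_text)

# Positional access assumes all ten elements are present: child #5 is
# expected to be DollarValue, but UnitValue has shifted into that slot.
shifted = detail[5].tag   # 'UnitValue', not the expected 'DollarValue'

# Name-based access stays correct no matter which optional elements
# are missing.
dollar = detail.findtext("DollarValue")   # None: the element is absent
unit = detail.findtext("UnitValue")       # '104881.270200'
```

    This is consistent with the symptom reported: when #6 is missing, whatever comes next is printed in #6's column.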

  • Report Output with Key Date

    Dear All,
    I have the following fields in DSO.
    Key Part: EmployeeID, DateTo, DateFrom.
    Data Part: Designation, AnnualSalary.
    I have data in the DSO as follows:
    EmployeeID DateTo        DateFrom      Designation AnnualSalary
    1                 12.11.2008 01.01.2008    Worker        10000
    1                 31.12.9999 13.11.2008    Officer         20000
    2                 14.11.2008  01.01.2008    Worker        10000
    2                 31.12.9999  15.11.2008    Officer        20000
    Now if I give the key date as 13.11.2008,
    the output of the report should be:
    EmployeeID Designation AnnualSalary
    1                 Officer         20000
    2                 Worker        10000
    How can I achieve this?
    Cheers,
    Neel.

    Hi,
    I think you can achieve this by using a formula with the boolean operators "is less than" and "is greater than", and then using row/column settings and/or conditions to suppress the rows that do not contain the key date.
    BR,
    Niclas
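    The suggested key-date logic amounts to keeping, per employee, the row whose validity interval contains the key date. A sketch (Python, using the DSO rows from the question; illustrative only, not BEx formula syntax):

```python
from datetime import date

# DSO rows: (EmployeeID, DateTo, DateFrom, Designation, AnnualSalary)
rows = [
    (1, date(2008, 11, 12), date(2008, 1, 1),   "Worker",  10000),
    (1, date(9999, 12, 31), date(2008, 11, 13), "Officer", 20000),
    (2, date(2008, 11, 14), date(2008, 1, 1),   "Worker",  10000),
    (2, date(9999, 12, 31), date(2008, 11, 15), "Officer", 20000),
]

def rows_for_key_date(rows, key_date):
    # Keep only the validity interval containing the key date:
    # DateFrom <= key_date <= DateTo.
    return [(emp, desig, sal)
            for emp, to, frm, desig, sal in rows
            if frm <= key_date <= to]

result = rows_for_key_date(rows, date(2008, 11, 13))
# [(1, 'Officer', 20000), (2, 'Worker', 10000)]
```

    This reproduces the expected output in the question: employee 1 is an Officer and employee 2 is still a Worker on 13.11.2008.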

  • BW Report mismatching with R3 data

    Hi,
    I am facing a very strange issue. In one of our BW reports we report billing document data with details like quantity and net value. For some billing documents the quantity does not match the SAP table VBRK/VF03. There is no restriction at query level; only a formula is used to get the billing quantity.
    I also tried a test query with no formula, but there too some billing document quantities do not match R/3. The billing quantity value coming into the InfoCube is correct and matches the R/3 value.
    Can anyone help, please? Quick help will be appreciated!
    Thanks,
    Nilesh

    Any unit conversions done?
    Or
    aggregates missing a rollup?

  • Report Builder with multiple data series

    I am not able to figure out how to create a bar chart using multiple data series in Report Builder. I can do it with explicit CFML coding, but not in Report Builder. I am running MX version 7 with the current (as of 9/1/07) download of Report Builder.
    My table contains visit information by site, with standard
    demographics on each visit (age, race, gender, etc.) I am trying to
    analyze the data by site, for example, race by site.
    I am using an SQL statement to sort and group by site &
    race. I can get a chart of the number of visits for each site or
    the total number of visits by race across all sites. But I can't
    get race by site. The SQL statement looks like this:
    SELECT race, site, COUNT(*) as counter
    FROM visit_info WHERE userid = #session.uid#
    GROUP BY site, race ORDER by site, race
    Every variation I have tried for showing race by site either
    leaves out part of the results or creates a bar chart that has the
    race and site categories as separate entities (White, Black,
    Chinese, Site1, Site2, Site3, etc). I've also tried creating a
    combined variable (site_race) but that doesn't seem to fare any
    better.
    Any help would be appreciated.
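    One way past the chart treating race and site as one flat category list is to pivot the flat GROUP BY rows into one series per race (with a value for every site) before handing them to the chart. A sketch of that reshaping (Python, with made-up counts; this illustrates the data shape only, not ColdFusion Report Builder's API):

```python
from collections import defaultdict

# Rows as returned by the GROUP BY site, race query: (race, site, counter)
rows = [
    ("Black", "Site1", 4), ("White", "Site1", 7),
    ("Black", "Site2", 2), ("Chinese", "Site2", 5),
    ("White", "Site3", 3),
]

sites = sorted({site for _, site, _ in rows})

# One series per race, with a count for every site (0 where absent),
# which is the shape a grouped bar chart expects.
series = defaultdict(lambda: {s: 0 for s in sites})
for race, site, count in rows:
    series[race][site] = count

# series["White"] -> {'Site1': 7, 'Site2': 0, 'Site3': 3}
```

    With the data in this shape, each race becomes one chart series and the sites become the shared category axis.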

    Pop,
    Badunit's example is one possible arrangement that you might have described; that is, one column for X values and several columns for Y values. This is the case when you should respond that you are sharing X-values. With all the data in one table, you can select it all at once, so you don't have the problem with dragging additional series into the chart.
    The Numbers User Guide PDF's first chapter is the place to start getting your bearings - you will learn how to distinguish Header vs. Body rows and columns. It's worth the effort to take another look at it.
    Since you are using a Scatter Chart, it's important not to use a header column for X because X values must be Numeric, as opposed to Text.
    Jerry

  • Tax error - audit data - table ETXDCJ with wrong data

    Hi,
    I am working on the audit_data file that the external tax system receives from SAP. The entries from table ETXDCJ are fed to the audit_data file, and they are wrong. Please let me know how this table gets its entries, and any transaction or steps I need to know to replicate the issue. We have this wrong tax data issue only for certain sales document types, not for all. Please let me know.
    Thanking you
    Ram.

    External tax calculation is a setting that you have to enable in configuration. Every document has a tax procedure and a pricing procedure assigned to it, so when you create a sales order your prices and taxes are calculated based on these two. I am not sure exactly how it works, but I think the external tax calculation software calculates and sends tax documents back to SAP; these are the ones that get stored in the tables ETXDCJ, ETXDCH and ETXDCI. The update function module EXTERNAL_TAX_DOC_INSERT_DB seems to be inserting them.
    What is it that you are calling an error? Are the calculations wrong, or is it something else? Are you sure it is the document type, and not the customer or state, that is influencing this? Look in configuration for external tax calculation under 'Financial Accounting Global Settings' as follows:
    Implementation Guide for R/3 Customizing (IMG)
    -->Financial Accounting
       -->Financial Accounting Global Settings
          -->Tax on Sales/Purchases
             -->External Tax Calculation
    I am wondering if your issue is with the tax jurisdictions and your condition table entries. If you are sure that it always happens with a particular document type, compare that document type's settings with another one that is working.
    If this helps, please don't forget to reward.
    Srinivas

  • Importing old pictures with wrong date stamp. After adjusting date they are landing in the correct folder of referenced masters.

    I was thrilled to learn that with Aperture I could have one big library, and that using referenced masters I can organize the pictures by date, so I know where everything is and can back up to the cloud. I started importing really old photos that were taken with god knows what camera, and the dates are wrong. Since I had all of these in date-structured folders, I could make a best guess at the date; it doesn't need to be exact, just close enough that images show up in the right projects and the originals land in the right folder for future use and backup. I have come to learn that changing the metadata only affects how Aperture sees it. In other words, images that were taken on 10/31/04 are still landing in the referenced-masters folder under the date the camera said. I now understand that the metadata date is not the date Aperture needs fixed. Is the best way to fix this to delete the images from Aperture, find some tool that will adjust them to the approximate real date, and then reimport? If so, can someone recommend a utility that does this? If I can fix this inside Aperture, even better. I saw on one of the discussions: "it's all about the Exif and IPTC and not about the file metadata." Please feel free to educate me on that. Cheers.

    That answers some of it: the relocating of files. That will allow me to put the files in the right year/month folder. I imported full folders into Aperture, so I need to edit the metadata date for multiple pictures at the same time. I understand that when you select multiple pictures and then adjust the time, it adjusts them in relation to the first picture. In my case these were taken by all sorts of crappy cameras and re-edited years later (2003 data), so pictures taken on the same day have 2001, 2003 and 2007 dates. I really do just want to set all of them to the same (ballpark) date.
    So my goals are:
    1) have the pictures land in the right referenced folder - answered
    2) do a mass adjust time to the same date for a folder of images
    3) have the projects then land chronologically inside Aperture - not sure about this one yet
    Thanks for your help, it's very much appreciated.

  • Report Model with multiple data sources

    Hi,
    We have a requirement to develop a report model in 2008 with multiple data sources, including SQL Server 2012 and an Oracle database, so end users can see all the relevant data in one place when running Report Builder. Please advise how this can be achieved.
    One option I can think of is pulling all the data into one database through SSIS and then building an SMDL on it.
    I have heard it might be possible to build a Tabular cube and then use Report Builder on it. I am not sure which is the best option, unless there is another solution.
    Thanks,

    You can use SSIS to collect data from multiple sources (SQL Server, Oracle, etc.) and store it in a SQL Server database. Then you can use SSAS to build a cube on that database. With SSRS/Report Builder you can create reports on the cube or on the database.
