Barcode CODE 128 with large number (being rounded?) (BI / XML Publisher 5.6.3)

After applying Patch 9440398 as per Oracle's Doc ID 1072226.1, I have successfully created a CODE 128 barcode.
But I am having an issue when creating a barcode whose value is a large number - specifically, a number longer than around 16 digits.
Here's my situation...
In my RTF template I am encoding a barcode for the number 420917229102808239800004365998 as follows:
<?format-barcode:420917229102808239800004365998;'code128c'?>
I then run the report and a PDF is generated with the barcode. Everything looks great so far.
But when I scan the barcode, this is the value I am reading (tried it with several different scanner types):
420917229102808300000000000000
So:
     Value I was expecting:  420917229102808239800004365998
     Value I actually got:   420917229102808300000000000000
It seems as if the number is getting rounded at the 16th digit or so (it varies depending on the value I use).
I have tried several examples and all seem to do the same. But anything with 15 digits or fewer seems to work perfectly.
Any ideas?
Manny
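
For reference, the symptom described here (the first 16 or so digits survive and the rest turn into zeros) is exactly what happens when a value passes through an IEEE 754 double, which carries only about 15-17 significant decimal digits. The following is a minimal Java sketch of that effect, purely as an illustration and assuming the template or data layer converts the literal to a number somewhere along the way; it is not BI Publisher code:

    import java.math.BigDecimal;

    public class BarcodePrecisionDemo {
        public static void main(String[] args) {
            String original = "420917229102808239800004365998";  // 30 digits

            // A double keeps only ~15-17 significant decimal digits,
            // so parsing the full literal silently drops the tail.
            double asDouble = Double.parseDouble(original);

            // Expand the double back to plain decimal notation for comparison.
            String roundTripped = new BigDecimal(Double.toString(asDouble)).toPlainString();

            System.out.println("original   : " + original);
            System.out.println("via double : " + roundTripped);  // digits beyond ~16 are gone
        }
    }

If that is indeed the cause, keeping the value as a character string end to end (in the data source and in the field handed to format-barcode) rather than as a numeric value should avoid the rounding.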

Yes, I have.
But I have found the cause now.
When working with parameters coming in from the concurrent manager, all the parameters defined in the concurrent program in EBS need to be in the same case (upper or lower) as they are defined in the data template.
Once I changed all to be the same case, it worked.
Thanks for the effort.
Regards,
Ronny

Similar Messages

  • How to Capture a Table with large number of Rows in Web UI Test?

    Hi,
    Is there any possibility to capture a DOM table with a large number of rows (say more than 100) in a Web UI Test?
    Or is this a bug?

    Hi,
    You can try the following code to capture the table values.
    To store the table values in a CSV file:
        web.table( xpath_of_table ).exportToCSVFile("D:\\exporttable.csv", true);
    To store the table values in a string:
        String tblValues = web.table( xpath_of_table ).exportToCSVString();
        info(tblValues);
    Thanks
    -POPS

  • Lookups with large number of records do not return the page

    Hi,
    I am developing an application using Oracle JHeadstart 10.1.3 Preview Version 10.1.3.0.78
    In my application I created a lookup under Domains and used that lookup for an attribute (Display Type for this attribute is dropDownList) in a group, to get the translation for this attribute. The group has around 14,800 records and the lookup has around 7,400 records.
    When I try to open this group (tab), the progress indicator shows that it is working, but the page does not open even after a long time.
    If I change the Display Type for the attribute from dropDownList to textInput then it works fine.
    I have other lookups with a lower number of records. Those lookups work fine with the dropDownList Display Type.
    I only have this kind of problem when a lookup has a large number of records.
    Is there any limit on the number of records for lookups under Domains?
    How can I solve this?
    I need to translate the attribute (get the description from another table using the code).
    Your help would be appreciated.
    Thanks
    Syed

    We have also faced a similar issue, but for us it was happening when we were using the dropDownList in a table, while the same dropDownList was working in form layout. In our case the JVM just used to crash, and after searching here in the forums we found that it might be related to a JVM issue on Windows XP machines without Service Pack 2.
    Anyway... the workaround that we took to get around the issue is to use an LOV instead of a dropDownList in JHeadstart.
    Hope this helps...
    - rutwik

  • Barcode code 128 scanning problem

    Hi,
    I am using a Code 128 barcode in Smart Forms, and I am currently working on ECC 6.0. These are the settings:
        Bar Code Symbology   Code 128
        Bar Code Alignment   Normal
        Narrow Module Width  09
        Linear Height        00130
        Code128 Mode:        A

        Bar Code Symbology   Code 128
        Bar Code Alignment   Normal
        Narrow Module Width  09
        Linear Height        00300
        Code128 Mode:        A
    But the scanner is unable to read the barcodes, even though I tried playing with the width and height of the barcode.
    Please let me know how to fix this issue.
    Thanks.

    Hi,
    I have a similar setup. I'm having a problem with the width of the barcode: even though it is set up as 1 cm height and 4 cm width, the preview shows it correctly, but once it is printed the width is only 2.3 cm.
    Any idea what is wrong?

  • FR Layout issue with large number of columns

    Hi!
    I'm developing a report in FR 11.1.1.3 with over 30 columns.
    The issue is that when I run the report in web preview, the dropdown for the dimension in the Page area goes to the far right and disappears from the display.
    If I reduce the number of the columns I don't have this problem.
    I've already tried to maximize the workspace to the maximum without any result.
    Can anyone help me to deal with reports with large numbers of columns?
    Regards,
    Luís

    IE8 could be the reason. According to the supported platform matrix (http://www.oracle.com/technetwork/middleware/bi-foundation/oracle-hyperion-epm-system-certific-2-128342.xls), tab "EPM System Basic Platform", row 70, in order for IE8 to work, FR and Workspace should be patched.
    FR Patch number: 9657652
    Workspace Patch number: 9314073
    Patches can be found on My Oracle Support. Just search for the patch number.
    Cheers,
    Mehmet

  • TableView performance with large number of columns

    I notice that it takes a while for table views to populate when they have a large number of columns (more than 100 or so, subjectively).
    Running VisualVM with CPU sampling, I see that the largest amounts of time are spent in:
    javafx.scene.control.TableView.getVisibleLeafIndex()   35.3%   8,113 ms
    javafx.scene.Parent$1.onProposedChange()                 9.5%   2,193 ms
    javafx.scene.control.Control.loadSkinClass()             5.2%   1,193 ms
    I am using JavaFX 2.1 co-bundled with Java 7u4. Is this to be expected, or are there some performance tuning hints I should know?
    Thanks,
    - Pat

    We're actually doing some TableView performance work right now. I wonder if you could file an issue with a simple reproducible test case? I haven't seen the data you show here in our profiles (nearly all of the time is spent on reapplying CSS), so I would be interested in your exact test to be able to profile it and see what is going on.
    Thanks
    Richard
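
    For reference, a minimal sketch of the kind of reproducible case being asked for: a TableView with well over 100 columns and a modest row count, with a rough timing taken around show(). This is an illustration only (the class name, the 150/50 column and row counts, and the map-backed rows are arbitrary choices), written against the JavaFX 2.x API without lambdas so it also applies to Java 7u4:

        import java.util.HashMap;
        import java.util.Map;

        import javafx.application.Application;
        import javafx.beans.property.ReadOnlyObjectWrapper;
        import javafx.beans.value.ObservableValue;
        import javafx.collections.FXCollections;
        import javafx.collections.ObservableList;
        import javafx.scene.Scene;
        import javafx.scene.control.TableColumn;
        import javafx.scene.control.TableView;
        import javafx.stage.Stage;
        import javafx.util.Callback;

        public class WideTableTest extends Application {

            private static final int COLS = 150; // comfortably above the ~100-column threshold
            private static final int ROWS = 50;  // kept small so column count is the only variable

            @Override
            public void start(Stage stage) {
                TableView<Map<String, Object>> table = new TableView<Map<String, Object>>();

                // One column per map key, each reading its cell value out of the row map.
                for (int c = 0; c < COLS; c++) {
                    final String key = "col" + c;
                    TableColumn<Map<String, Object>, Object> col =
                            new TableColumn<Map<String, Object>, Object>(key);
                    col.setCellValueFactory(
                            new Callback<TableColumn.CellDataFeatures<Map<String, Object>, Object>,
                                         ObservableValue<Object>>() {
                                @Override
                                public ObservableValue<Object> call(
                                        TableColumn.CellDataFeatures<Map<String, Object>, Object> cd) {
                                    return new ReadOnlyObjectWrapper<Object>(cd.getValue().get(key));
                                }
                            });
                    table.getColumns().add(col);
                }

                // Populate the rows with simple placeholder strings.
                ObservableList<Map<String, Object>> items = FXCollections.observableArrayList();
                for (int r = 0; r < ROWS; r++) {
                    Map<String, Object> row = new HashMap<String, Object>();
                    for (int c = 0; c < COLS; c++) {
                        row.put("col" + c, "r" + r + "c" + c);
                    }
                    items.add(row);
                }
                table.setItems(items);

                long start = System.currentTimeMillis();
                stage.setScene(new Scene(table, 1200, 600));
                stage.show();
                System.out.println("show() returned after "
                        + (System.currentTimeMillis() - start) + " ms");
            }

            public static void main(String[] args) {
                launch(args);
            }
        }

    Attaching something along these lines to the issue, together with the VisualVM numbers above, should make the profile easy to reproduce.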

  • BUG: Last Image of Large Number Being Moved fails

    This has happened several times while organizing some folders. Moving over 100 images at a time, it seems that one image near the end fails - I get the screen saying Lightroom can't move the image right now. It's always just one image. I can move it on its own just a second later and it works just fine.
    While the Move operation is being fixed, consider that it could go way faster than it does now if the screen didn't have to be refreshed after each file has been moved.  I can see the value of the refresh if it's just a few images being moved, but for a large number, the refresh isn't helpful anyhow.
    Paul Wasserman

    I posted on this last week, and apparently a number of people have experienced this.
    http://forums.adobe.com/thread/690900
    Please report it on this bug report site so that it gets to the developers' attention sooner:
    https://www.adobe.com/cfusion/mmform/index.cfm?name=wishform
    Bob

  • SSO with large number of users

    Hi,
    We want to implement SSO using user mapping because we have different user IDs from system to system.
    We have a large number of users in our system; how can we implement user mapping?
    Is there any way to write a program that takes care of the user mapping? If yes, can you please give an overview so that I can dig into it?
    Thanks,
    Damodhar.

    Hi Damodhar
    User mapping can be done at the programming level. The User Management Engine in EP 6.0 provides two interfaces to access the user mapping data, namely
    1. IUserMappingService.
    2. IUserMappingData.
    You can implement these two interfaces to enable User Mapping. Please refer to the following link for further details.
    http://help.sap.com/saphelp_nw04/helpdata/en/69/3482ee0d70492fa63ffe519f5758f5/content.htm
    Hope that was helpful.
    Best Regards
    Priya

  • To find total number of pages in XML publisher

    1. Is there any method to find the total number of pages in BI / XML Publisher?
    2. Is there any method to repeat a column (not rows) on every page? For example, you have a table with two columns, say "A" and "B". I want to repeat column B on every page, and column A should print only on the first page.
    3. Is there any method to print a text, say "ABC", only in the first-page footer, based on a condition?
    Thanks
    Wazid

    Exactly what my requirement is: in my template I have set the header and footer page setup to "different first page", and I am also printing user text at the bottom of the last page using <?start@last-page-first:body?> <?end body?>. When I run the template, for multi-page output the user text displays at the bottom of the last page, which works fine, but for single-page output the user text is not displayed at the bottom of the page because of the "different first page" header and footer. If you have any idea, please suggest.

  • Number format issue in XML Publisher from OAF (',.' is replaced by 'u n')

    Hi All,
    I am facing a problem in an XML Publisher report. In the report I need to display some number fields in 'USD' format. In the RTF, the data type 'Number' and the format '###,##0.00' are selected. If I run the concurrent program from Oracle core forms using System Administrator > Concurrent > Request, I get the correct number format.
    Ex : 123456.00 After Number format : 123,456.00
    Also it works fine from the XML Publisher Administrator responsibility (using Preview).
    But the problem is that when I submit the concurrent request through OAF, I am not getting correct data: ',' is replaced by 'u' and '.' is replaced by 'n'.
    Ex: 123456.00 In Report 123u456n00
    I checked the profile value ICX: Numeric Characters and it is set to 1,000.00 at site level.
    Do I need to set a character set anywhere in OAF before calling the concurrent program?
    I tried to use "alter session set nls_numeric_character = ',.';" by calling a stored procedure from OAF. But it is still not working.
    Please give me a solution for this.
    Regards,
    Sadanand

    Hello - I am running into the same issue. Did you guys find a resolution for this? Please let me know; I would really appreciate your help.
    Thanks,
    Dhiraj
    [email protected]
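
    For what it's worth, the symptom looks like the formatter picking up the wrong separator characters rather than a broken format mask. Below is a minimal Java illustration of the general mechanism only; it is not EBS or XDO code, and the characters 'u' and 'n' are set deliberately as stand-ins to mimic the reported output:

        import java.text.DecimalFormat;
        import java.text.DecimalFormatSymbols;
        import java.util.Locale;

        public class SeparatorDemo {
            public static void main(String[] args) {
                // The mask "###,##0.00" only marks where grouping and decimals go;
                // the actual characters come from the active symbols / NLS settings.
                DecimalFormatSymbols expected = new DecimalFormatSymbols(Locale.US);
                System.out.println(new DecimalFormat("###,##0.00", expected).format(123456.00));
                // prints 123,456.00

                DecimalFormatSymbols broken = new DecimalFormatSymbols(Locale.US);
                broken.setGroupingSeparator('u'); // stand-ins only, to mimic the reported output
                broken.setDecimalSeparator('n');
                System.out.println(new DecimalFormat("###,##0.00", broken).format(123456.00));
                // prints 123u456n00
            }
        }

    In other words, the mask controls where the separators go, while the session's numeric-character settings control which characters are used, which would explain why the same template renders differently depending on how the request is submitted.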

  • Slow record selection in tableView component with large number of records

    Hi experts,
    we have a Business Server Page (flow logic) with several htmlb:inputFields. As known from the SAP standard, we would like to offer a value help (F4) to the users for ease of record selection.
    We use the onValueHelp() method of the inputField to open an extra browser window through JavaScript. The popup calls another HTML page containing a tableView component with all available records. We use the SINGLESELECT mode for the table view.
    Everything works perfectly and efficiently, unless the tableView contains too many entries. If the number of possible entries is large, the whole component performs very slowly. For example, selecting a record can take more than one minute. Navigation between pages through the buttons at the bottom of the component also takes a lot of time. It seems that the tableView component cannot handle so many entries.
    We tried to switch between stateful and stateless mode, without success. Is there a way to perform the tableView selection without doing a server round trip? Any ideas and comments will be appreciated.
    Best regards,
    Sebastian

    Hi Raja,
    thank you for your hint. I took a look at sbspext_table/TableViewClient.bsp but did not really understand how the JavaScript coding works. Where is the JavaScript code in that example? Which file contains it?
    Meanwhile I implemented another way to avoid the server round trip:
    - Switch page mode of the popup window to "Stateful"
    - Use OnInitialization method like OnCreate (as shown in [using OnInitialization like OnCreate])
    - Limit the results of the SELECT statement with UP TO 1000 ROWS
    Best regards,
    Sebastian

  • Filter item limits - search not returning any results with a large number of elements, otherwise OK

    Hi,
    We are working through a problem we've encountered with Azure Search. We are building a filter string of the form "id eq 'xxx' or id eq 'ccc' or id eq 'vvv'" etc. The ids are provided in a collection and we loop through, building the string until it's ready to apply.
    We are using the 2015-02-28 preview at the moment.
    We are encountering a situation where, after approximately 20 ids, Azure Search doesn't return any results, nor does it appear to return any error code. I'm pretty sure that the URL length is less than 8K.
    Is there any limit on the number of filter elements in a query?
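
    For context, this is roughly the kind of filter construction being described; Java is used here only for illustration, and the field name 'id' and the three example values are placeholders for the real collection:

        import java.util.Arrays;
        import java.util.List;

        public class FilterBuilder {
            public static void main(String[] args) {
                // Placeholder ids; the real ones come from the caller's collection.
                List<String> ids = Arrays.asList("xxx", "ccc", "vvv");

                StringBuilder filter = new StringBuilder();
                for (String id : ids) {
                    if (filter.length() > 0) {
                        filter.append(" or ");
                    }
                    // One OData equality clause per id; single quotes delimit string literals.
                    filter.append("id eq '").append(id).append("'");
                }

                System.out.println("$filter=" + filter);
            }
        }

    Each additional id adds roughly 15-20 characters to the query string, so a few dozen ids already produce a fairly long $filter expression.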

    We followed up offline.
    The symptom in this case was a 200 response with no body. The underlying cause is a URL parsing bug that tries to interpret colons in the query string as the delimiter of a URL scheme (like https:), but with a hard length limit of 1KB. We will work on a fix for both the underlying URL parsing issue and the issue that caused it to surface as a body-less 200.
    In the meantime, the workaround is to put colons as close to the beginning of the URL query string as possible. Specifically, putting $filter and facets first, and putting expressions with colons within those first, will mitigate this in most cases.
    Note that the .NET SDK puts $filter and facets near the beginning of the query string by default, so if you're consuming Azure Search you might want to give it a try:
    http://www.nuget.org/packages/Microsoft.Azure.Search/

  • File Bundle with large number of files failed

    Hi!
    Well, I thought problems would appear. We have some apps that are distributed just by copying a large number of files (not large in size) to Windows (XP Pro, usually) machines. These are programs that run from a directory without any special installation. A happy situation for an admin, from one side. In ZfD 4.0.1 we installed such an app on one of the machines, took a snapshot via the special snapshot app (who remembers it), copied the files to a (NetWare) server share, gave rights to the device (~ workstation), associated it with the workstation via eDirectory and ... voila, on the next restart or whatsoever the app was there. Very nice indeed, I miss this!
    So I tried to make this happen on ZCM 10 (on SLES 11). I created the app, sorry, bundle, uploaded the files (the first time it got stuck, the second time it completed, around 7,500 files) and set up the distribution/launch association to the workstation (~ device). And ... got errors. Several entries from the log are below.
    Any ideas?
    More thanks, Alar.
    Error: [1/8/10 2:41:53 PM] BundleManager BUNDLE.UnknownExceptionOccurred An Unknown exception occurred trying to process task: Novell.Zenworks.AppModule.LaunchException: Exception of type 'Novell.Zenworks.AppModule.LaunchException' was thrown.
    at Novell.Zenworks.AppModule.AppActionItem.ProcessAction(APP_ACTION launchType, ActionContext context, ActionSetResult previousResults)
    Error: [1/8/10 2:41:54 PM] BundleManager ActionMan.FailureProcessingActionException Failed to process action: Information for id 51846d2388c028d8c471f1199b965859 has not been cached. Did you forget to call CacheContentInfo first?

    ZCM 10 is not efficient at handling that number of files in a single bundle when they are in the content repo.
    Suggestions include zipping the files, uploading the zip to the content repo, and then downloading and extracting it as part of the bundle.
    Or use the "Copy Directory" option to copy the files from a network source directly, like you did in ZDM.

  • Can't Empty Trash With Large Number of Files

    Running OS X 10.8.3
    I have a very large external drive that had a Time Machine backup on the main partition. At some point, I created a second partition, then started doing backups on the new partition. On Wed, I finally got around to doing some "housecleaning" tasks I'd been putting off. As part of that, I decided to clean up my external drive. So... I moved the old, unused and unwanted Backups.backupdb that used to be the Time Machine backup, and dragged it to the Trash.
    Bad idea.
    Now I've spent the last 3-4 days trying various strategies to actually empty the trash and reclaim the gig or so of space on my external drive.  Initially I just tried to "Empty Trash", but that took about four hours to count up the files just to "prepare to delete" them. After the file counter stopped counting up, and finally started counting down... "Deleting 482,832 files..." "Deleting 482,831 files..." etc, etc...  I decided I was on the path to success, so left the machine alone for 12-14 hours.
    When I came back, the results were not what I expected. "Deleting -582,032 files..."  What the...?
    So after leaving that to run for another few hours with no results, I stopped that process.  Tried a few other tools like Onyx, TrashIt, etc...  No luck.
    So finally decided to say the **** with the window manager, pulled up a terminal, and cd'ed to the .Trash directory for my UID on the USB volume and did a rm -rfv Backups.backupdb
    While it seemed to run okay for a while, I started getting errors saying "File not found..." and "Invalid file name..." and various other weird things.  So now I'm doing a combination of rm -rfing individual directories, and using the finder to rename/cleanup individual Folders when OSX refuses to delete them.
    Has anyone else had this weird overflow issue with deleting large numbers of files in 10.8.x? Doesn't seem like things should be this hard...

    I'm not sure I understand this bit:
    If you're on Leopard 10.5.x, be sure you have the "action" or "gear" icon in your Finder's toolbar (Finder > View > Customize Toolbar).  If there's no toolbar, click the lozenge at the upper-right of the Finder window's title bar.  If the "gear" icon isn't in the toolbar, select View > Customize Toolbar from the menubar.
    Then use the Time Machine "Star Wars" display:  Enter Time Machine by clicking the Time Machine icon in your Dock or select the TM icon in your Menubar.
    And this seems to defeat the whole purpose:
    If you delete an entire backup, it will disappear from the Timeline and the "cascade" of Finder windows, but it will not actually delete the backup copy of any item that was present at the time of any remaining backup. Thus you may not gain much space. This is usually fairly quick
    I'm trying to reclaim space on a volume that had a time machine backup, but that isn't needed anymore. I'm deleting it so I can get that 1GB+ of space back. Is there some "official" way you're supposed to delete these things where you get your hard drive space back?

  • Generating table with large number of columns (256)

    Hi,
    I don't know if this is right place for posting this:
    for data mining purposes I need a table whose column names are strings of size N, where N is 2, 3 or 4. Column names are built over the nucleotide alphabet A, C, T, G, taking all variations with repetition. For N=2 the columns are: AA, AC, AT, AG, CA, CC, CT, CG, TA, TC, TT, TG, GA, GC, GT, GG (4^2 = 16). For N=3 there are 4^3 = 64, and for N=4 there are 4^4 = 256.
    The primary key of the table is the nucleotide sequence, and the values in the columns above are 1 or 0 for each sequence, depending on whether that nucleotide string of size N occurs in the chain or not.
    My questions are:
    1) Is there any tool (in Oracle Data Miner or elsewhere) that can generate such a table from an array of nucleotides (one nucleotide array is an ordinary string)?
    2) It's not a problem to generate this table myself, but is it "normal" to have a database table with 257 columns (when N = 4)?
    I hope that my problem is clear (because of my English).
    Thanks in advance.
    Regards.

    Without knowing the reason for doing so, I would guess this kind of design is quite poor.
    How are you planning to specify your queries? (I know dynamic SQL would be a possibility...)
    Another point of view:
    What if someone decides to have column names built using 5 nucleotides (4^5 = 1024)? This would exceed the maximum number of columns in a table, which is currently (10g, 11g) 1000.
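
    Setting aside whether the wide-table design is wise, generating the 4^N column names and the 0/1 presence flags is straightforward. A minimal Java sketch (the class and method names are only illustrative):

        import java.util.ArrayList;
        import java.util.List;

        public class NucleotideColumns {
            private static final char[] BASES = {'A', 'C', 'T', 'G'};

            // All strings of length n over {A,C,T,G}: 4^n names (16, 64, 256 for n = 2, 3, 4).
            static List<String> columnNames(int n) {
                List<String> names = new ArrayList<String>();
                build("", n, names);
                return names;
            }

            private static void build(String prefix, int remaining, List<String> out) {
                if (remaining == 0) {
                    out.add(prefix);
                    return;
                }
                for (char b : BASES) {
                    build(prefix + b, remaining - 1, out);
                }
            }

            // Flag row: 1 if the sequence contains the N-mer, 0 otherwise.
            static int[] flags(String sequence, List<String> names) {
                int[] row = new int[names.size()];
                for (int i = 0; i < names.size(); i++) {
                    row[i] = sequence.contains(names.get(i)) ? 1 : 0;
                }
                return row;
            }

            public static void main(String[] args) {
                List<String> cols = columnNames(2);
                System.out.println(cols.size() + " columns: " + cols);
                System.out.println(java.util.Arrays.toString(flags("ACTGAC", cols)));
            }
        }

    The same loop that produces the names could also emit the CREATE TABLE statement or the INSERT values, whichever direction is chosen.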
