Performance issues when working with huge lists

I’ve got a script that reads a large CSV spreadsheet
and parses the data into a list of the form [[A1,B1,C1],
[A2,B2,C2], [A3,B3,C3]] and a second list of the form
[#A1:B1,#A2:B2,#A3:B3] etc… The actual spreadsheet is about
10 columns x 10,000 rows. Reading the file string goes fast enough,
the parsing starts off fast but slows to a crawl after about 500
rows (I put the row count on the stage to check progress). Does
anyone know if the getaProp, addProp, and append methods are
sensitive to the size of the list?
A sample of one of the parsing loops is below. I’m
aware all interactivity will stop as this is executed. This script
is strictly for internal use, it crunches the numbers in two
spreadsheets and merges the results to a new CSV file. The program
is intended to run overnight and the new file harvested in the
morning.

> Does anyone know if the getaProp, addProp, and
> append methods are sensitive to the size of the list?
Is this a trick question? Sure they are. All of them.
AddProp and append are quite fast (the list object preallocates memory as
required, so it scales well), so I doubt they are the cause of the problem.
GetAProp searches the list item by item; therefore, if you are searching
for the last item, or if the item is not in the list, the more items there
are, the slower the command.
Didn't go through all your code but I noticed
- this: repeat with rowCount = 2 to file2string.line.count
Big no-no! Line counting is a very slow operation to be evaluating inside a
loop (see the reworked loop sketched after the quoted code below).
- and this: myFile2data.append(myLineData)
String operations like this require memory reallocation and are therefore
very slow. If you do conclude that such an operation is causing the problem,
consider using a preallocated buffer (create a big string in advance) and
then use
mydata.char[currentOffset..(currentOffset + newStr.length)] = newStr
This can make code run even hundreds of times faster than the append method.
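
For what it's worth, here is a minimal, untested Lingo sketch of that preallocated-buffer idea. The handler name, bufferSize and the sample row string are just illustrative; it assumes you can estimate a rough upper bound on the size of the final output, and it uses the verbose chunk syntax (put ... into char ... of ...) for the slice assignment:

on buildOutput bufferSize
  -- grow the buffer to at least bufferSize characters, once, up front
  outBuffer = " "
  repeat while outBuffer.length < bufferSize
    put outBuffer after outBuffer
  end repeat
  currentOffset = 1
  -- write one sample row by overwriting a slice of the buffer instead of appending
  newStr = "A1,B1,C1" & RETURN
  put newStr into char currentOffset to (currentOffset + newStr.length - 1) of outBuffer
  currentOffset = currentOffset + newStr.length
  -- when all rows have been written, keep only the part of the buffer actually used
  return outBuffer.char[1..(currentOffset - 1)]
end

The doubling loop only runs a handful of times, so nearly all of the reallocation cost is paid once up front instead of on every row.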
Applied CD wrote:
> I've got a script that reads a large CSV spreadsheet and parses the data into a
> list of the form [[A1,B1,C1], [A2,B2,C2], [A3,B3,C3]] and a second list of the
> form [#A1:B1,#A2:B2,#A3:B3] etc… The actual spreadsheet is about 10 columns x
> 10,000 rows. Reading the file string goes fast enough, the parsing starts off
> fast but slows to a crawl after about 500 rows (I put the row count on the
> stage to check progress). Does anyone know if the getaProp, addProp, and
> append methods are sensitive to the size of the list?
>
> A sample of one of the parsing loops is below. I'm aware all interactivity
> will stop as this is executed. This script is strictly for internal use, it
> crunches the numbers in two spreadsheets and merges the results to a new CSV
> file. The program is intended to run overnight and the new file harvested in
> the morning.
>
> writeLine("File 2 Data Parsing" & RETURN)
> myOrderColumn = myHeaders2.getOne("OrderNum")
> myChargesColumn = myHeaders2.getOne("Cost")
> myFile2data = []
> mergedFedExCharges = [:]
> repeat with rowCount = 2 to file2string.line.count
>   myLineData = []
>   repeat with i = 1 to file2string.line[rowCount].item.count
>     myItem = file2string.line[rowCount].item[i]
>     if i = 1 then
>       myItem = chars(myItem,2,myItem.length)
>     end if
>     myLineData.append(myItem)
>   end repeat
>   if myLineData.count = myHeaders2.count then
>     myFile2data.append(myLineData)
>     myOrderSymbol = symbol("s" & myLineData[myOrderColumn])
>     myCurrentValue = getaProp(mergedFedExCharges,myOrderSymbol)
>     if voidP(myCurrentValue) then
>       mergedFedExCharges.addProp(myOrderSymbol,0.00)
>     end if
>     mergedFedExCharges[myOrderSymbol] = mergedFedExCharges[myOrderSymbol] + value(myLineData[myChargesColumn])
>     writeUpdate(myLineData[1])
>   else
>     writeError("file 2 : " & string(myLineData) & RETURN)
>   end if
> end repeat
>
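
For reference, here is a minimal, untested sketch of how the quoted loop could be reshaped along the lines suggested above: the line count is taken once before the loop and each row string is pulled out of the big string once per iteration. The local names lineTotal, thisRow and itemTotal are just illustrative, and it assumes the itemDelimiter has already been set to "," elsewhere in the script, as the original code implies:

lineTotal = file2string.line.count -- count the lines once, before the loop
repeat with rowCount = 2 to lineTotal
  thisRow = file2string.line[rowCount] -- extract the row from the big string once
  itemTotal = thisRow.item.count
  myLineData = []
  repeat with i = 1 to itemTotal
    myItem = thisRow.item[i]
    if i = 1 then myItem = chars(myItem, 2, myItem.length)
    myLineData.append(myItem)
  end repeat
  -- the rest of the per-row work (myFile2data, mergedFedExCharges) stays as quoted above
end repeat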

Similar Messages

  • How to get around a performance issue when dealing with a lot of data

    Hello All,
    This is an academic question really, I'm not sure what I'm going to do with my issue, but I have some options.  I was wondering if anyone would like to throw in their two cents on what they would do.
    I have a report; the users want to see all agreements and all conditions related to the updating of rebates and the affected invoices. From a technical perspective that means ENT6038-KONV-KONP-KONA-KNA1; these are the tables I have to hit. The problem is that when they retroactively update rebate conditions they can hit thousands of invoices, which blossoms out to thousands of conditions... you see the problem. I simply have too much data to grab, and it times out.
    I've tried everything around the code.  If you have a better way to get price conditions and agreement numbers off of thousands of invoices, please let me know what that is.
    I have a couple of options.
    1) Use shared memory to preload the data for the report. This would work, but I won't know what data needs to be loaded until report run time. They put in a date; I simply can't preload everything. I don't like this option much.
    2) Write a function module to do this work. When the user clicks the button to get this particular data, it will launch the FM in the background and e-mail them the results. As you know, the background job won't time out. So far this is my favored option.
    Any other ideas?
    Oh...nope, BI is not an option, we don't have it. I know, I'm not happy about it. We do have a data warehouse, but the prospect of working with that group makes me wince.

    My two cents - firstly, I totally agree with Derick that it's probably a good idea to go back to the business and have them justify their requirement in regards to reporting and "whether any user can meaningfully process all those results in an aggregate". But having dealt with customers across industries over a long period of time, it would probably be a bit fanciful to expect them to change their requirements too much, as in my experience they neither understand (too much) technology nor want to hear about technical limitations of a system etc. They want what they want, if possible yesterday!
    So, about dealing with performance issues within ABAP, I'm sure you must already be using efficient programming techniques like hashed internal tables with unique keys, accessing rows of the table using field symbols and all that, but what I was going to suggest is to look at using [Extracts|http://help.sap.com/saphelp_nw04/helpdata/en/9f/db9ed135c111d1829f0000e829fbfe/content.htm]. I've had to deal with this a couple of times in the past when working with massive amounts of data and I found it to be very efficient in regards to performance. A good point to remember when using Extracts, quoting from SAP Help: "The size of an extract dataset is, in principle, unlimited. Extracts larger than 500KB are stored in operating system files. The practical size of an extract is up to 2GB, as long as there is enough space in the filesystem."
    Hope this helps,
    Cheers,
    Sougata.

  • Performance issue while working with large files.

    Hello Gurus,
    I have to upload about 1 million keys from a CSV file on the application server and then delete the entries from a DB table containing 18 million entries. This is causing performance problems and my program is very slow. Which approach would be better?
    1. First read all the data in the CSV and then use the delete statement?
    2. Or delete each line directly after reading the key from the file?
    And another program has to update about 2 million entries in a DB table containing 20 million entries. Here I also have very big performance problems (the program has been running for more than 14 hours). What is the best way to work with such a large amount of data?
    I tried to rewrite the program so that it will run in parallel, but since this program will only run once, the cost of implementing aRFC parallelization is too big. Please help; maybe someone doing migrations is good at this.
    Regards,
    Ioan.

    Hi,
    I would suggest, you should split the files and then process each set.
    Lock the table to ensure it is available the whole time.
    After each set, do a commit and then proceed.
    This would ensure that a break in the middle doesn't force you to start again; you only have to remove the already-processed entries from the files.
    Also make use of sorted tables and keys when deleting/updating the DB.
    In a delete, when multiple entries are involved, using an internal table might be tricky as some records may be successfully deleted and some may not.
    To make sure, first get the count of records in the DB that match internal table set 1.
    Then do the delete from the DB with internal table set 1.
    Again check the count in the DB that matches internal table set 1 and verify the count is zero.
    This would make sure all the records were deleted, but again it may add some performance overhead.
    And the goal here is to reduce the execution time.
    Gurus may have a better idea..
    Regards
    Sree

  • Adobe Bridge permission issues when working with network account

    We have some users using Adobe Bridge CS3, CS4 and CS5.
    They are using Bridge to browse around the file server.
    I have ACL's so the users can have full access using the Finder, deleting, making folders etc.
    Users are logged in to their AD account with their home folder synced to a Mac OS X server 10.6.4.
    Unless they are the POSIX owner of a folder they can't make any changes using Bridge.
    If they log in to a local account on a Mac and then manually connect to a share on the same server all is fine with Bridge. They can make folders, delete files etc etc.
    Any ideas?
    I know Adobe made it easy for themselves years ago by simply stating it is not supported to work on a server using any Adobe app.

    jhellstrom wrote:
    We have some users using Adobe Bridge CS3, CS4 and CS5.
    They are using Bridge to browse around the file server.
    I have ACL's so the users can have full access using the Finder, deleting, making folders etc.
    Users are logged in to their AD account with their home folder synced to a Mac OS X server 10.6.4.
    Unless they are the POSIX owner of a folder they can't make any changes using Bridge.
    If they log in to a local account on a Mac and then manually connect to a share on the same server all is fine with Bridge. They can make folders, delete files etc etc.
    Any ideas?
    I know Adobe made it easy for themselves years ago by simply stating it is not supported to work on a server using any Adobe app.
    As you said Adobe washed their hands of the matter and as any business customer is almost certainly going to use a server, and as Adobe products are so expensive pretty much only business customers can afford them, this is totally inexcusable.
    I had a similar network-login-related problem with Acrobat Pro in version 7.0 which, after +two years+ (no exaggeration), was eventually fixed in Acrobat Pro 8.1. Unfortunately it was then broken again in Acrobat Pro 9.0. Based on my experiences with Acrobat Pro, and due to a totally different reason (nothing to do with using Adobe SW), I found that when I switched from using AFP for network home directories to instead using NFS, my problems with Adobe software and network login accounts went away.
    So, it might be worth your thinking about switching to using NFS for home directories as a workaround.
    Adobe have become even worse than Microsoft for their software. They use a variety of sucky installers instead of Apple's free Installer (Microsoft have switched to using Apple's Installer), they use product activation, they are even worse than MS Office at working with servers. The only redeeming fact Adobe has is that their Mac and Windows products are mostly equivalent unlike Microsoft who still cripple their Mac software.

  • Bluetooth issue when working with Safari but not other applications

    I have tried two new Bluetooth headsets (Motorola 9HD and Sony DR-BT100cx) that both pair with and work just fine with the local iTunes application and any DVD I use. But these devices do not work with Safari. I tried going to Hulu, YouTube, etc., and I can tell the Bluetooth device is 'listening' because I can hear it click or wait for a signal. But I never hear anything. If I switch over to DVD or my podcasts that are local, it works fine. If I go to Bluetooth settings and select 'stop using headset', the MacBook speakers pick up right away.
    Note: I am a brand new Mac user, so please disregard my OS version I have selected. I have no idea how to find my OS version, nor do I know how to install Firefox to see if it is simply a browser issue.
    Thanks,
    Jay

    I have the same issue. I've tried 3 different Bluetooth headsets, and they all react the same:
    - music: ok
    - skype: ok
    - webcasts / video (safari or firefox): no sound, and video stops
    HOWEVER: same system, same headset, using Internet Explorer in my windows xp VM: everything works.
    Anyone know how to contact Apple and log this issue?

  • Getting null returned when working with Linked Lists

    Ok, I'm having the same problem with a couple of methods that deal with linked lists, and I was hoping someone could lend me a hand. First off, the specification for one of the methods:
    An iterative procedure smallElements
    PARAMETERS: a ListItem reference, ls
    an integer n
    RETURN VALUE: a new list identical to the given list, except
    that it contains no occurrences of numbers greater than n.
    for example, given input list ( 3 2 6 3 4 ) and 3,
    the return value would be the list ( 3 2 3 )
    And here is my code:
    ListItem smallElements(ListItem ls, int n){
        ListItem small = null;
        ListItem result = small;
        if(ls == null)
            return null;
        else{
            while(ls != null){
                if(ls.number <= n){
                    result = new ListItem(ls.number, null);
                    result = result.next;
                }
                ls = ls.next;
            }
        }
        return small;
    }
    Like the topic says, I keep getting null returned as a value. I have tried setting small = new ListItem(ls.number, null), and that actually returns the correct list, except that the first number is repeated twice. I would greatly appreciate any assistance.

    I am not sure I understand your code. What exactly are those ListItems? It seems to me that you are dealing with single List elements, while the specification says that you are supposed to return a List.
    But the main error is that you have two ListItem objects there, which seem to serve the same purpose - only you build one and return the other. 'small', which is the one you return, never gets set to anything other than null.
    I think you should do something like this: make a new, empty list to return
    for element in parameterlist
        if number is smaller than n
            add this element to returnlist
    return returnlist

  • Win7-64 Photoshop CC blank Save As dialog, then crash, when working with video / frame animations

    Hi, I recently updated to CC from CS6 and am having issues when working with PSDs that contain video layers or are set up as frame animations: when attempting to Save As, the dialog will be completely blank except for the Save and Cancel buttons. If I press Cancel, then return to Save As, CC often crashes.
    Here is the Error Event from Event Viewer:
    Faulting application name: Photoshop.exe, version: 14.0.0.0, time stamp: 0x5176451b
    Faulting module name: MediaCoreIF.DLL, version: 7.0.0.0, time stamp: 0x51573a21
    Exception code: 0xc0000005
    Fault offset: 0x00000000001fca98
    Faulting process id: 0x750
    Faulting application start time: 0x01ce9e1925a5045b
    Faulting application path: C:\Program Files\Adobe\Adobe Photoshop CC (64 Bit)\Photoshop.exe
    Faulting module path: C:\Program Files\Adobe\Adobe Photoshop CC (64 Bit)\MediaCoreIF.DLL
    Report Id: db6d5f8f-0a0c-11e3-9df9-0026b9cc01d1
    So far I am only having these issues if I have been working with video layers and then frame animations.  Frame animations in particular seem to cause instability as I can often work with video layers all day without a problem.  Then if I import video frames to layers, or open a PSD containing a frame animation, the Save As may not necessarily work.
    Sometimes closing and restarting Photoshop will allow me to save again, but then sometimes a whole system reboot is required.  It is only temporary though.
    I am not running antivirus software or other apps aside from what starts at boot.

    Frustrating that this thread has petered out, since I'm having the exact same issue. Did you ever find a way to improve stability?
    Also of note, the issue followed me across machines. Two weeks ago I purchased a new machine, figured if that didn't solve the problem nothing would... It didn't. :\
    The only similar factors are that the machines both are 64bit Windows 7 (a fresh OEM install on the new box) and have nVidia-brand graphics cards. (Again, though - not the SAME graphics card, picked up a new one with the new machine.) No drive cloning was involved: this is completely new hardware, a fresh factory install of Windows 7, different antivirus (went from AVG to Avast), and new download of Photoshop CC from CreativeCloud.
    Looks like our event details are quite similar, too.
    Log Name: Application
    Source: Application Error
    Date: 4/6/2014 1:57:39 PM
    Event ID: 1000
    Task Category: (100)
    Level: Error
    Keywords: Classic
    User: N/A
    Computer: REDACTED
    Description:
    Faulting application name: Photoshop.exe, version: 14.2.1.570, time stamp: 0x52f4a9f2
    Faulting module name: MediaCoreIF.DLL, version: 7.0.0.0, time stamp: 0x51573a21
    Exception code: 0xc0000005
    Fault offset: 0x00000000001fca98
    Faulting process id: 0x1ca4
    Faulting application start time: 0x01cf51c68639ec43
    Faulting application path: C:\Program Files\Adobe\Adobe Photoshop CC (64 Bit)\Photoshop.exe
    Faulting module path: C:\Program Files\Adobe\Adobe Photoshop CC (64 Bit)\MediaCoreIF.DLL
    Report Id: 5268f22b-bdbd-11e3-b9df-d850e65afc6f

  • Crashing when working with cutaways

    Hey all!
    really enjoying the iMovie 11. It's, how do I say, magical
    But there is an annoying issue - when working with cutaways, especially when I have two or more in a row, when trying to align them so they appear one after the other, iMovie always crashes...
    Anyone else have similar issues?
    All updates have been installed....

    Okay that was a no-brainer. Since it was my first time using it, I had to go to Preferences and check "advanced features." Simple enough. I found this out while I was on hold, all on my own. Being put on hold is good for self-resiliency.

  • InDesign CC is performing poorly, slowly when working with text.

    InDesign CC is performing poorly, slowly when working with text. I tried updating and it seems like everything installed except for Extension Manager 6.0.7, which failed to install. Any fix for this?

    Thanks, I think the problem is solved. Previously, the CC Extension Manager failed to install when I did an update via the InDesign Help menu. InDesign CC updated successfully, however, InDesign was still acting wonky. Since downloading a new full CC Extension Manager a little while ago, InDesign seems to be working better. Is there a correlation between the two apps for future reference?

  • Performance issues when creating a Report / Query in Discoverer

    Hi forum,
    Hope you can help; it involves a performance issue when creating a Report / Query.
    I have a Discoverer Report that currently takes less than 5 seconds to run. After I add a condition to bring back Batch Status that = ‘Posted’ we cancelled the query after reaching 20 minutes as this is way too long. If I remove the condition the query time goes back to less than 5 seconds.
    Please see attached the SQL Inspector Plan:
    Before Condition
    SELECT STATEMENT
    SORT GROUP BY
    VIEW SYS
    SORT GROUP BY
    NESTED LOOPS OUTER
    NESTED LOOPS OUTER
    NESTED LOOPS
    NESTED LOOPS
    NESTED LOOPS
    NESTED LOOPS OUTER
    NESTED LOOPS OUTER
    NESTED LOOPS
    NESTED LOOPS OUTER
    NESTED LOOPS OUTER
    NESTED LOOPS
    NESTED LOOPS
    NESTED LOOPS
    NESTED LOOPS
    NESTED LOOPS
    NESTED LOOPS
    NESTED LOOPS
    NESTED LOOPS
    NESTED LOOPS
    NESTED LOOPS
    NESTED LOOPS
    TABLE ACCESS BY INDEX ROWID GL.GL_CODE_COMBINATIONS
    AND-EQUAL
    INDEX RANGE SCAN GL.GL_CODE_COMBINATIONS_N2
    INDEX RANGE SCAN GL.GL_CODE_COMBINATIONS_N1
    TABLE ACCESS BY INDEX ROWID APPLSYS.FND_FLEX_VALUES
    INDEX RANGE SCAN APPLSYS.FND_FLEX_VALUES_N1
    TABLE ACCESS BY INDEX ROWID APPLSYS.FND_FLEX_VALUE_SETS
    INDEX UNIQUE SCAN APPLSYS.FND_FLEX_VALUE_SETS_U1
    TABLE ACCESS BY INDEX ROWID APPLSYS.FND_FLEX_VALUES_TL
    INDEX UNIQUE SCAN APPLSYS.FND_FLEX_VALUES_TL_U1
    INDEX RANGE SCAN APPLSYS.FND_FLEX_VALUE_NORM_HIER_U1
    TABLE ACCESS BY INDEX ROWID GL.GL_JE_LINES
    INDEX RANGE SCAN GL.GL_JE_LINES_N1
    INDEX UNIQUE SCAN GL.GL_JE_HEADERS_U1
    INDEX UNIQUE SCAN GL.GL_SETS_OF_BOOKS_U2
    TABLE ACCESS BY INDEX ROWID GL.GL_JE_HEADERS
    INDEX UNIQUE SCAN GL.GL_JE_HEADERS_U1
    INDEX UNIQUE SCAN GL.GL_DAILY_CONVERSION_TYPES_U1
    TABLE ACCESS BY INDEX ROWID GL.GL_JE_SOURCES_TL
    INDEX UNIQUE SCAN GL.GL_JE_SOURCES_TL_U1
    INDEX UNIQUE SCAN GL.GL_JE_CATEGORIES_TL_U1
    INDEX UNIQUE SCAN GL.GL_JE_HEADERS_U1
    INDEX UNIQUE SCAN GL.GL_JE_HEADERS_U1
    INDEX UNIQUE SCAN GL.GL_JE_BATCHES_U1
    INDEX UNIQUE SCAN GL.GL_BUDGET_VERSIONS_U1
    INDEX UNIQUE SCAN GL.GL_ENCUMBRANCE_TYPES_U1
    INDEX UNIQUE SCAN GL.GL_SETS_OF_BOOKS_U2
    TABLE ACCESS BY INDEX ROWID GL.GL_JE_BATCHES
    INDEX UNIQUE SCAN GL.GL_JE_BATCHES_U1
    INDEX UNIQUE SCAN GL.GL_SETS_OF_BOOKS_U2
    INDEX UNIQUE SCAN GL.GL_JE_BATCHES_U1
    TABLE ACCESS BY INDEX ROWID GL.GL_PERIODS
    INDEX RANGE SCAN GL.GL_PERIODS_U1
    After Condition
    SELECT STATEMENT
    SORT GROUP BY
    VIEW SYS
    SORT GROUP BY
    NESTED LOOPS
    NESTED LOOPS OUTER
    NESTED LOOPS OUTER
    NESTED LOOPS
    NESTED LOOPS
    NESTED LOOPS
    NESTED LOOPS
    NESTED LOOPS OUTER
    NESTED LOOPS
    NESTED LOOPS
    NESTED LOOPS
    NESTED LOOPS
    NESTED LOOPS
    NESTED LOOPS
    NESTED LOOPS OUTER
    NESTED LOOPS
    NESTED LOOPS OUTER
    NESTED LOOPS
    NESTED LOOPS
    NESTED LOOPS OUTER
    NESTED LOOPS
    TABLE ACCESS FULL GL.GL_JE_BATCHES
    INDEX UNIQUE SCAN GL.GL_SETS_OF_BOOKS_U2
    INDEX UNIQUE SCAN GL.GL_JE_BATCHES_U1
    TABLE ACCESS BY INDEX ROWID GL.GL_JE_HEADERS
    INDEX RANGE SCAN GL.GL_JE_HEADERS_N1
    INDEX UNIQUE SCAN GL.GL_SETS_OF_BOOKS_U2
    INDEX UNIQUE SCAN GL.GL_ENCUMBRANCE_TYPES_U1
    INDEX UNIQUE SCAN GL.GL_DAILY_CONVERSION_TYPES_U1
    INDEX UNIQUE SCAN GL.GL_BUDGET_VERSIONS_U1
    TABLE ACCESS BY INDEX ROWID GL.GL_JE_SOURCES_TL
    INDEX UNIQUE SCAN GL.GL_JE_SOURCES_TL_U1
    INDEX UNIQUE SCAN GL.GL_JE_CATEGORIES_TL_U1
    INDEX UNIQUE SCAN GL.GL_JE_BATCHES_U1
    TABLE ACCESS BY INDEX ROWID GL.GL_JE_LINES
    INDEX RANGE SCAN GL.GL_JE_LINES_U1
    INDEX UNIQUE SCAN GL.GL_SETS_OF_BOOKS_U2
    TABLE ACCESS BY INDEX ROWID GL.GL_CODE_COMBINATIONS
    INDEX UNIQUE SCAN GL.GL_CODE_COMBINATIONS_U1
    TABLE ACCESS BY INDEX ROWID GL.GL_PERIODS
    INDEX RANGE SCAN GL.GL_PERIODS_U1
    TABLE ACCESS BY INDEX ROWID APPLSYS.FND_FLEX_VALUES
    INDEX RANGE SCAN APPLSYS.FND_FLEX_VALUES_N1
    INDEX RANGE SCAN APPLSYS.FND_FLEX_VALUE_NORM_HIER_U1
    TABLE ACCESS BY INDEX ROWID APPLSYS.FND_FLEX_VALUES_TL
    INDEX UNIQUE SCAN APPLSYS.FND_FLEX_VALUES_TL_U1
    TABLE ACCESS BY INDEX ROWID APPLSYS.FND_FLEX_VALUE_SETS
    INDEX UNIQUE SCAN APPLSYS.FND_FLEX_VALUE_SETS_U1
    INDEX UNIQUE SCAN GL.GL_JE_HEADERS_U1
    INDEX UNIQUE SCAN GL.GL_JE_HEADERS_U1
    INDEX UNIQUE SCAN GL.GL_JE_HEADERS_U1
    Is there anything I can do in Discoverer Desktop / Administration to avoid this problem?
    Many thanks,
    Lance

    Hi Rod,
    I've tried the condition (Batch Status||'' = 'Posted') as you suggested, but the query time is still over 20 mins. To test, I changed it to (Batch Status||'' = 'Unposted') and the query returned within seconds again.
    I’ve been doing some more digging and have found the database view that is linked to the Journal Batches folder. See below.
    I think the problem is with the column using DECODE. When querying the column in TOAD the value 'P' is returned, but in Discoverer the condition is applied to the value 'Posted'. I'm not too sure how DECODE works, but I think this could be causing some sort of issue with full table scans. How do we get around this?
    Lance
    DECODE( JOURNAL_BATCH1.STATUS,
    '+', 'Unable to validate or create CTA',
    '+*', 'Was unable to validate or create CTA',
    '-','Invalid or inactive rounding differences account in journal entry',
    '-*', 'Modified invalid or inactive rounding differences account in journal entry',
    '<', 'Showing sequence assignment failure',
    '<*', 'Was showing sequence assignment failure',
    '>', 'Showing cutoff rule violation',
    '>*', 'Was showing cutoff rule violation',
    'A', 'Journal batch failed funds reservation',
    'A*', 'Journal batch previously failed funds reservation',
    'AU', 'Showing batch with unopened period',
    'B', 'Showing batch control total violation',
    'B*', 'Was showing batch control total violation',
    'BF', 'Showing batch with frozen or inactive budget',
    'BU', 'Showing batch with unopened budget year',
    'C', 'Showing unopened reporting period',
    'C*', 'Was showing unopened reporting period',
    'D', 'Selected for posting to an unopened period',
    'D*', 'Was selected for posting to an unopened period',
    'E', 'Showing no journal entries for this batch',
    'E*', 'Was showing no journal entries for this batch',
    'EU', 'Showing batch with unopened encumbrance year',
    'F', 'Showing unopened reporting encumbrance year',
    'F*', 'Was showing unopened reporting encumbrance year',
    'G', 'Showing journal entry with invalid or inactive suspense account',
    'G*', 'Was showing journal entry with invalid or inactive suspense account',
    'H', 'Showing encumbrance journal entry with invalid or inactive reserve account',
    'H*', 'Was showing encumbrance journal entry with invalid or inactive reserve account',
    'I', 'In the process of being posted',
    'J', 'Showing journal control total violation',
    'J*', 'Was showing journal control total violation',
    'K', 'Showing unbalanced intercompany journal entry',
    'K*', 'Was showing unbalanced intercompany journal entry',
    'L', 'Showing unbalanced journal entry by account category',
    'L*', 'Was showing unbalanced journal entry by account category',
    'M', 'Showing multiple problems preventing posting of batch',
    'M*', 'Was showing multiple problems preventing posting of batch',
    'N', 'Journal produced error during intercompany balance processing',
    'N*', 'Journal produced error during intercompany balance processing',
    'O', 'Unable to convert amounts into reporting currency',
    'O*', 'Was unable to convert amounts into reporting currency',
    'P', 'Posted',
    'Q', 'Showing untaxed journal entry',
    'Q*', 'Was showing untaxed journal entry',
    'R', 'Showing unbalanced encumbrance entry without reserve account',
    'R*', 'Was showing unbalanced encumbrance entry without reserve account',
    'S', 'Already selected for posting',
    'T', 'Showing invalid period and conversion information for this batch',
    'T*', 'Was showing invalid period and conversion information for this batch',
    'U', 'Unposted',
    'V', 'Journal batch is unapproved',
    'V*', 'Journal batch was unapproved',
    'W', 'Showing an encumbrance journal entry with no encumbrance type',
    'W*', 'Was showing an encumbrance journal entry with no encumbrance type',
    'X', 'Showing an unbalanced journal entry but suspense not allowed',
    'X*', 'Was showing an unbalanced journal entry but suspense not allowed',
    'Z', 'Showing invalid journal entry lines or no journal entry lines',
    'Z*', 'Was showing invalid journal entry lines or no journal entry lines', NULL ),

  • Table contained in a topic causes slow response when working with source

    I am using RoboHelp V8.0.2 (I have applied the two fixes available from the Adobe product site).
    Here is my issue:
    I am documenting a product that has over 1,500 metrics.
    I am trying to make a list of 1,077 metrics in one topic (all the metrics are of the same "type" so they are grouped together).
    The table presenting these metrics has two columns, a Field Name column and a Description column.
    There is not a lot of text within the table - most of the explanations and calculations are defined outside the table.
    When I have more than about 400 table rows, the response time working with the topic in source is just horrible.
    It took 3 minutes to open the topic in source, another 3 minutes to move the table over .425in from margin.
    When I try to close the project, it gets hung up.
    Has anyone else run into a problem working with tables within a topic?
    I had a similar response time issue with V8.0.0 in general when I first installed it.
    Once I installed the two fixes, that problem was (for the most part) eliminated.
    I think the second fix specifically addressed response time issues when working in source.
    I am going to try this on one of my other writer's machine to see if it is the age of my computer.
    My computer setup is:
    Dell Precision workstation with an Intel Xeon 2.40GHz processor and 2GB of RAM.
    I am using Windows XP Professional 5.1.26 Service Pack 3.0.
    I have a 110 GB hard drive with 18 GB available.
    As always - thanks for any insight or information.
    Michael F Weart

    Hi Michael.
    Interestingly the maximum number of rows you can use when you create a table via the Table > Insert > Table menu item is 100. That said, I've just created a 500 row table and whilst it is slower than a smaller table it is OK. I am also using RH 8.0.2 but on a higher spec PC.
    Read the RoboColum(n) for a tips, tricks and musings on the Technical Communication Suite products.
    Follow the RoboColum(n) on Twitter

  • Disable updating of the menu widget when working with Muse file?

    First of all thank you Adobe for creating Muse - I really like it a lot!
    My question:
    when I work on my website with 250+ pages, it strains the performance to make even the slightest changes to the page structure. I work with two Masters, each with much the same horizontal menu widget.
    Adding, deleting and rearranging pages makes the PC (a Windows 7 laptop of some but not awesome power) come to a halt and take a deep breath for some seconds. It really interrupts the flow of working.
    A crude workaround for this is to remove the menu widget from the two masters and put it into a new master page containing only the copied menu, keeping the rest of the header in place on the original masters to use for aligning on the pages. This way, only one page has to update when adding, deleting and rearranging pages.
    I have looked for a preference to disable updating of the Menu widget while working on the page structure, and to enable the updating again when the work is done. Does it exist? If not, I would recommend it strongly :-) since my workaround is a bit tricky/risky to use after publishing ;-)
    Again, thanks a lot for this nice program. Being a self-taught CS (now CCC) user I like the ease of control of the web design you can get with Muse (at the expense of other features of course - if you ever wanted to add some sort of database-lookup feature it would be MUCH appreciated. One workaround is to use an iframe and an external search/lookup page that returns results with links back to parent pages. A bit funny construction though :-) ).
    BR
    M. Hecquet

    Glad to know you are enjoying Muse.
    It can be very processor intensive if you have multiple Master pages with their own All Pages menu widget. Take a look at the following threads and refer to Zak's response.
    http://forums.adobe.com/thread/1423767
    http://forums.adobe.com/message/6166722
    See if you are able to use just one All Pages menu widget at your site while taking advantage of the hierarchical Master page feature.
    http://tv.adobe.com/watch/muse-feature-tour/adobe-muse-hierarchical-master-pages/
    We are aware of the performance issues caused by the Menu widget (All Pages type); it is something being looked at by engineering for one of the upcoming releases.
    Thanks,
    Vinayak

  • Performance issue in correlation with hidden objects in reports

    Hello,
    is there a known performance issue connected with hidden objects in reports using conditional suppression? (HFM version 11.1.2.1)
    With comprehensive reports, we see huge performance differences between the same report with and without hidden objects. Furthermore, we suspect that some of the trouble with our reporting server environment stems from end users running these reports.
    Every advice would be welcome!
    Regards,
    bsc
    Edited by: 972676 on Nov 22, 2012 11:27 AM

    If you say that working with EVDRE for each separate sheet is fine, that means the main problem is related to your custom VB macro interdependency.
    I suggest adding a log (writing to a text file) for your macro, and you will see that the minute is actually spent performing operations from the custom macro.
    Kind Regards
    Sorin Radulescu

  • Performance Issues when editing large PDFs

    We are using Adobe 9 and X Professional and are experiencing performance issues when attempting to edit large PDF files (Windows 7 OS). When editing PDFs that are 200+ pages, we are seeing pregnant pauses (that feel like lockups), slow open times and slow-to-print issues.
    Are there any tips or tricks with regard to working with these large documents that would improve performance?

    You said "edit." If you are talking about actual editing, that should be done in the original and a new PDF created. Acrobat is not a very good editing tool and should only be used for minor, critical edits.
    If you are talking about simply using the PDF, a lot depends on the structure of the PDF. If it is full of graphics, it will be slow. You can improve this performance by using the PDF Optimizer to reduce graphic resolution and such. You may very likely have a bloated PDF that is causing the problem, and optimizing the structure should help.
    Be sure to work on a copy.

  • Oracle Retail 13 - Performance issues when open, save, approving worksheets

    Hi Guys,
    Recently we started facing performance issues when working with Oracle Retail 13 worksheets from within the Java GUI on client desktops.
    We run Oracle Retail 13.1 powered by Oracle Database 11g R1 and AS 10g in the latest release.
    Issues:
    - Opening, saving and approving worksheets with approx 9 thousand items takes up to 15 minutes.
    - Time for smaller worksheets is also around 10 minutes just to open a worksheet
    - Also, just opening multiple worksheets takes "ages", up to 10-15 minutes.
    Questions:
    - Is it expected performance for such worksheets?
    - What is your experience with Oracle Retail 13 in terms of performance while working with worksheets - how much time does it normally take to open, edit and save a worksheet?
    - What are the average expected times for such operations?
    Any feedback and hints would be much appreciated.
    Cheers!!

    Hi,
    I guess you mean Order/Buyer worksheets?
    This is not normal; it should be quicker, a matter of seconds to at most a minute.
    Database side tuning is where I would look for clues.
    And the obvious question: remember any changes to anything that may have caused the issue? Are the table and index statistics freshly gathered?
    Best regards, Erik Ykema
