Issue with Consolidate Action through Batch Process

We find that when "Load" reports an error, "Consolidate" does not fire, but only when run from Batch. When processing manually, Consolidate fires regardless of Load errors.
We are using an Essbase load rule that keeps loading records when there are kickouts, so the Essbase API returns an error code to FDM (which is fine).
We want to run a calc script whether or not the load errors.
Again, with a manual process there is no problem, but Batch stops before firing Consolidate.
We checked the Batch flag; we set it to "10" (Consolidate) and even "12" (Check).
Any thoughts?
We are trying to avoid altering the Load Action code to execute the Essbase calc within that script.
Any input regarding this issue would be greatly appreciated.
Thanks


Similar Messages

  • Authorization issue on hire action through HCM Process and Forms

    Hi All,
    We are executing the hiring action through an HCM form process. The process uses the HR_PL_ADMINISTRATOR role on ECC, which is super-admin access to execute the action, and the HR Administrator role on the portal. If we restrict the role by personnel area, we do not see the hire process on the portal.
    Could you please let me know if anyone has faced this issue?
    Thanks,
    Gowri

    Thanks for responding. I have seen the link before.
    We have the HR_PL_ADMINISTRATOR_000 role. The role has the P_ASRCONT, P_ORGIN and P_PERNR objects. Object P_ORGIN needs to be set as
    Authorization level            Read
    Infotype                           *
    Personnel Area                *
    Employee Group              *
    Employee Subgroup         *
    Subtype                           *
    Organizational Key            *
    in order to show the Hire Process in the Execute Hiring link on the Portal. If we put a restriction on personnel area, we do not see the Hire process in Execute Hiring on the portal.
    Kindly advise.
    Thanks,
    Gowri

  • Using actions and batch processing to maximize compatibility on old psd files

    My apologies if this has been answered -- in my search I saw that it was asked multiple times, but not answered.
    My goal is to be able to import 5 years of PSD files into Lightroom that were originally saved with maximize compatibility turned off. These files are in shoot folders within year folders.
    I have been trying to do this with an action and batch processing through Bridge (Tools > Photoshop > Batch). After switching the preference to Maximize Compatibility = Always, I tried an action with a Save command, but that doesn't switch the file to maximize compatibility. Save As does, but it also fixes the save destination in the action to a specific folder. I want to be able to overwrite the existing PSD file, leaving it where it is.
    Has anyone successfully tackled this issue? I know I can handle it folder by folder, with an action for each folder, but that just wouldn't be cool.
    Thank you

    L Show - Yes, I found a way and did just that.
    Everything as you wrote, using the Save As option. Yes, it will put the folder name into the action, BUT it will override it when you run your batch from Bridge IF you check the box "Override Action 'Save As' Commands" (and set Destination: Save and Close).
    Works like a charm (although slow; it's not utilizing the multiple cores).

  • Issue with folder action created with automator

    Hello all,
    I'm having an issue with folder actions that I created with Automator, and it's driving me crazy. Apple says they can't help with things created by the user, so this is my only shot at figuring this out.
    Running a Mini 2.3 GHz quad i7, 8 GB RAM on OS X Mavericks 10.9.2.
    My goal is to set up an action that adds text to the filename of any file placed into a particular folder. So I open Automator and choose Folder Action, then choose the folder I want the action applied to at the top. Then I go to Files & Folders in the library, choose Rename Finder Items, and drag it over to the workflow. I select Add Text, enter the text I want added, and choose "before name" because I want the added text to precede the original filename. I save it, then go to the folder I applied it to, right-click, go to Services > Folder Actions, and verify that the service is attached to that particular folder, and it is.
    Should work, right? No. When I place a file in that folder, the action runs and adds the text like I want it to... but then it starts adding it over and over again in an infinite loop. It also adds a file with the extension .DS_Store to the folder, which likewise has the name added over and over.
    I've tried deleting all the folder actions and even deleting the workflow files saved by Automator in the user Library.
    I'm stumped... no idea what to do. A while ago I used Automator to batch-rename files and it worked perfectly. I tried the same steps listed above on a machine in an Apple retail store and it worked... so I know I have the setup right.
    Any help is greatly appreciated.
    Thanks,
    Justin

    Hi JK257
    This is why it happens:
    The folder action is looking for new items in the folder. You drop in a file called "File1". So it renames it "sometext File1". Then it sees this new file called "sometext File1" and thinks "Hey, this is a new file - I'll call it 'sometext sometext File1' because I have to rename every new file." Then it sees this new file called "sometext sometext File1" and thinks...
    You get the picture.
    (The .DS_Store file is a normally hidden database file which is being revealed and renamed by the same process.)
    The solution is to move the files out of the renamer folder into a receiving folder after they have been renamed.
    Hope this helps,
    H
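Outside Automator, the rename-then-move pattern H describes can be sketched in a few lines of Python (the folder names and the "sometext " prefix are hypothetical, taken from the example above):

```python
import shutil
from pathlib import Path

PREFIX = "sometext "  # hypothetical text to prepend, as in the example above

def rename_and_move(drop_folder, done_folder):
    """Prefix each file dropped into drop_folder, then move it into
    done_folder so the watched folder never sees its own output and
    the infinite rename loop cannot start."""
    done = Path(done_folder)
    done.mkdir(parents=True, exist_ok=True)
    for f in Path(drop_folder).iterdir():
        # skip hidden files such as .DS_Store
        if f.is_file() and not f.name.startswith("."):
            shutil.move(str(f), str(done / (PREFIX + f.name)))
```

The key design point is that the renamed file ends up outside the watched folder, so the folder action (or this script) never re-triggers on its own output.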

  • Issue with ABAP program execution from process chains

    Hi All:
    We have a process chain with 3 steps, each of them executing the same program with three different variants. The program FTPs the file from APO's data exchange (mount) to another FTP server. The first variant transfers file A to a directory on the external FTP server (say /X). The second and third variants are supposed to transfer different files, B and C, to the same directory.
    That is where the problem is. After the process chain completes successfully, I see two files B and C, but their contents are the same, namely C's. If I switch the steps in the process chain to run A, then C, then B, I see files B and C with the content of C. I tried C, then A, then B; the file names are correct, but the contents are A, A, then B.
    Have any of you come across this issue? Do you know whether this is a known problem? If you have a solution, please let me know.
    Thanks
    Narayanan

    Narayanan,
    Instead of doing it in three steps, would it be possible for you to have one UNIX script (or equivalent) do the above, and call it from your process chain?
    We do a lot of FTPs, but our file names are standard and we have a UNIX script for them, executed using a system command through a process chain; it has been working without issues for the past year.
    Maybe I have not understood your situation properly; some more detail on the program and on what you are doing would help. Also your SP levels, please.
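A single-pass transfer along the lines suggested above might be sketched as follows (hypothetical host, directories, and file names; Python's ftplib stands in for the UNIX script). Each upload opens its own fresh file handle, so one file's content cannot leak into another:

```python
from ftplib import FTP
from pathlib import Path

def upload_plan(local_dir, names, remote_dir="/X"):
    """Pair each local file with its target path on the remote server."""
    return [(Path(local_dir) / n, f"{remote_dir}/{n}") for n in names]

def upload_all(host, user, password, plan):
    """Upload every file in one FTP session, one fresh handle per file."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        for local, remote in plan:
            with open(local, "rb") as fh:
                ftp.storbinary(f"STOR {remote}", fh)
```

Usage would be `upload_all("ftp.example.com", "user", "pw", upload_plan("/data", ["A", "B", "C"]))`; all names here are placeholders, not taken from the thread.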

  • Image Sequences and Batch Processing: Can opening files from different folders as image sequences be set up as an action for batch processing?

    I have an ongoing series of tasks that necessitates opening the contents of a series of folders as image sequences.
    I'm used to setting up some fairly complex actions, including ones that are intended to be applied to all of the contents of materials in various subfolders, but for some reason I cannot get an Open-with-Image-Sequence-Checked action to set up in a way that will batch process correctly.
    Complicating matters is that the file prefixes (i.e., the letters before the sequential numbers) and the folders will always have different names from the last time the process had to be carried out.
    The steps I would like the action to carry out would be the following:
    1. Open Subfolder 1 - that is, Photoshop shouldn't be looking for a specific folder. It's just *any* subfolder to the current parent folder.
    2. Select first file in sequenced set of files.
    3. Open as an image sequence.
    4. Open Subfolder 2.
    5. Select first file in sequenced set of files.
    6. Open as an image sequence.
    ...rinse and repeat until Photoshop runs out of subfolders to check.
    Am I asking the impossible?
    Thanks!
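Steps 1 through 6 boil down to "for each subfolder, find the first frame of the sequence". Photoshop itself would have to do the actual opening, but the selection logic can be sketched separately (Python, with a purely hypothetical folder layout):

```python
from pathlib import Path

def first_frames(parent):
    """For each subfolder of parent, return the first file in sorted
    order -- the frame an image-sequence open would start from."""
    result = {}
    for sub in sorted(p for p in Path(parent).iterdir() if p.is_dir()):
        files = sorted(f for f in sub.iterdir() if f.is_file())
        if files:
            result[sub.name] = files[0].name
    return result
```

Because the traversal keys off "first file in each subfolder" rather than a recorded prefix, it tolerates folder and prefix names changing from run to run, which is the constraint described above.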

    I'll look around there, because I had posted something about this and was told to look here: ps-scripts - Browse /Image Processor Pro/v3_0 at SourceForge.net
    [The screenshot showing the result I received did not survive.]
    Granted, I'm new to using scripts and actions for processing large numbers of files.

  • Issue with filtering KPIs through perspectives in a Tabular model

    I am having issues with trying to filter KPIs through a perspective. In my fact table, I have three KPIs created out of SumOf measures. There are three different user groups, and one user group wants a KPI specifically for their area, so it should not show up in the other perspectives.
    I remove the SumOf column from the list of fields under the Perspectives menu (there is no option for a KPI, just Sum of [Column Name]). When I select that perspective in my model view, it does not show the KPI under my fact table, which is what I would expect.
    When I try Analyze in Excel, or deploy to the server and connect to that perspective through a pivot table, sometimes all KPIs show up (with my Sum of ... column removed) and other times zero KPIs appear. When I connect to the default perspective, however, everything appears.
    Has anyone run into this issue before? I have tried a process recalc on my cube through SSDB (I'm not sure if I did it right) and I tried a recalc through the Process menu, to no avail. Any help will be greatly appreciated because I am stumped at the moment.

    I've had problems with KPIs in perspectives (i.e. not showing up in the correct perspective). Unfortunately, I have not been able to find a resolution. At the time I experienced the issue, I tried googling and looking at the forums, but didn't come up with anything.

  • Issue with Dynamic Actions - Infotype 0041 Update

    Hi All,
    Requirement :
    I have to update Date type 13 (Resignation Date, DAR11) without a date (DAT11 left blank; the user will fill it through the infogroup) while performing the Termination action through PA40.
    Added below code in V_T588z (0000):
    I added lines 186 and 188; the other lines are standard.
                  04     180     P     PSPAR-MASSN='10'/X  **Termination
                  04     181     P     PSPAR-MASSN='ZB'/X **Termination
                  04     182     P     PSPAR-MASSN='14'/X  **Termination
                  04     185     I     COP,41,,,(PSPAR-BEGDA),(PSPAR-ENDDA)/D
                  04     186     W     P0041-DAR11='13' **BEGIN & END OF CRA01670 CODE ***
                  04     188     W     P0041-DAR11='' **BEGIN & END OF CRA01670 CODE ***
    Issue:
    1. The above logic works only for India; it is not working for Finland and the other countries.
    2. Sometimes Date type 13 comes in twice when the infogroup's 0041 screen is run through twice, which is incorrect; infotype 0041 should be called only once. I have checked the user parameters, MOLGA, and USR, but it is not working.
    3. The standard configuration for Date type 09 (Last day of work) works fine, but I have not been able to find where it comes from; I checked the dynamic actions, but no use.
    Need your inputs to solve the issue.
    Thanks,
    N.L.Narayana

    P     P0000-MASSN='01'    Action type
    P     P0000-MASSG='01'    Action reason
    P     T001P-MOLGA='40'
    F     (here you keep the call routine)
    I     INS,2001,,,(P0000-BEGDA),(P0000-ENDDA)/D   (here you pass the date from the call routine)
    W     P2001-IAWRT='01'
    But here infotype 2001 takes an absence type; this infotype is used for booking absence types, and I did not get wage type 8001. Through which infotype are you processing it?
    For more info, check table V_T588Z. Here I am pasting the sample call routine which I got from an earlier thread:
    Create a Program (Subroutine) using SE38. Save and activate it.
    ZZSY_DYA_TEST2
    REPORT ZZSY_DYA_TEST2.
    tables : p0000,p0001,rp50d.
    FORM getdate.
    rp50d-date1 = p0000-begda + 90.
    ENDFORM.
    FORM get_date.
    rp50d-date2 = p0000-begda + 180.
    ENDFORM.
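For what it's worth, the two FORMs above just add fixed offsets of 90 and 180 days to the action's begin date (RP50D-DATE1 and DATE2). The same arithmetic, purely as an illustration outside ABAP:

```python
from datetime import date, timedelta

def get_dates(begda):
    """Mirror the two sample FORMs: BEGDA + 90 days and BEGDA + 180 days."""
    return begda + timedelta(days=90), begda + timedelta(days=180)
```

Note that calendar-day addition is what the ABAP date arithmetic does here as well; there is no month-based rounding.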

  • MGE - issues with pers. actions for delegation in ECC6.0 - management of global employees

    Hi experts,
    during the implementation of MGE at my client we are facing issues with the personnel actions for delegation.
    We were using the standard actions as follows, without success.
    action Expatriation Planning (81), IG 92 via PA40: the new personnel number was created successfully, the employment status was set to Withdrawn, IT715 "host" was set to "planned", IT710 was filled (required fields only; no admin, no manager, no sending personnel assignment info), and all payroll-relevant ITs were skipped (706 + 707)
    IT715 was set to "to be activated" via PA30
    running the report RPMGE_Activation for the newly created personnel number (assignment nr), personnel action 82 Activation in Host Country, IG GE; the following ISSUES occur:
    The action cannot be executed via PA40 (is this normal behaviour?)
    When running it with the report, the employment status changed to Active
    IT715 "host" was not updated in status (it should have been updated to "in progress", right?)
    IT715 "home" was not created (is that correct?)
    IT710 did not change
    Background information:
    > Switch MAINS is activated
    > We are not going to use global payroll! No switch has been activated. Payroll will always run in home country, so the client does not want to use any of the py relevant ITs dealing with global employment.
    Many thanks + best wishes,
    Evelyn

    Hi experts,
    short update on our issue:
    After several attempts at the same thing, the report now works fine, as it should, for 82 and also in the next step for 83. IT715 is updated correctly. We need to work with the report and not with PA40.
    Some open questions remain:
    - Set home assignment to active during delegation (not to inactive for payroll reasons in home country)
    Documentation for RPMGE_Activation says
    Home Activation (Active) (84)
    Same process as described above, except that in the third step the Employment Status field in infotype Actions (0000) is set to Active if there are any trailing payments in the home country.
    We understand that 84 is identical to 83 except that the employment status is different.
    For action 84 a different IG (93) underlies it than for 83 (GE), and it is also called "Change expat. planning". That action does not seem to work with the report either.
    >> Is this correct? Do we need to create our own Home Activation (Active) action by copying 83 and changing the employment status setting?
    How do we need to configure feature ACTCE then? EXPATACTHOME would need to have 2 entries at the end, no?
    - End of delegation
    Which actions are used for the end of delegation? Are there standard ones that we could not find?
    We would like to set the host country to withdrawn and the home country back to active (also change of employee group in this case).
    Also using the logic of IT715, is there any automatic way of changing the host country record to status completed, or would we use an exit action, including IT715 and set status manually?
    Same question for home country.
    Many Thanks,
    Evelyn

  • Performance issues with dynamic action (PL/SQL)

    Hi!
    I'm having performance issues with a dynamic action that is triggered on a button click.
    I have 5 drop down lists to select columns which the users want to filter, 5 drop down lists to select an operation and 5 boxes to input values.
    After that, there is a filter button that just submits the page based on the selected filters.
    This part works fine, the data is filtered almost instantaneously.
    After this, I have 3 column selectors and 3 boxes where users put values they wish to update the filtered rows to,
    There is an update button that calls the dynamic action (procedure that is written below).
    It should be straightforward; the only performance concern could be the decode section, because I need to cover the cases where the user wants to set a value to null ('@') and where he doesn't want to update all 3 columns but fewer (he leaves '').
    Hence P99_X_UC1 || ' = decode('  || P99_X_UV1 ||','''','|| P99_X_UC1  ||',''@'',null,'|| P99_X_UV1  ||')
    However, when I finally click the update button, my browser freezes and nothing happens to the table.
    Can anyone help me solve this and improve the speed of the update?
    Regards,
    Ivan
    P.S. The code for the procedure is below:
    create or replace
    PROCEDURE DWP.PROC_UPD
    (P99_X_UC1 in VARCHAR2,
    P99_X_UV1 in VARCHAR2,
    P99_X_UC2 in VARCHAR2,
    P99_X_UV2 in VARCHAR2,
    P99_X_UC3 in VARCHAR2,
    P99_X_UV3 in VARCHAR2,
    P99_X_COL in VARCHAR2,
    P99_X_O in VARCHAR2,
    P99_X_V in VARCHAR2,
    P99_X_COL2 in VARCHAR2,
    P99_X_O2 in VARCHAR2,
    P99_X_V2 in VARCHAR2,
    P99_X_COL3 in VARCHAR2,
    P99_X_O3 in VARCHAR2,
    P99_X_V3 in VARCHAR2,
    P99_X_COL4 in VARCHAR2,
    P99_X_O4 in VARCHAR2,
    P99_X_V4 in VARCHAR2,
    P99_X_COL5 in VARCHAR2,
    P99_X_O5 in VARCHAR2,
    P99_X_V5 in VARCHAR2,
    P99_X_CD in VARCHAR2,
    P99_X_VD in VARCHAR2
    ) IS
    l_sql_stmt varchar2(32600);
    p_table_name varchar2(30) := 'DWP.IZV_SLOG_DET'; 
    BEGIN
    l_sql_stmt := 'update ' || p_table_name || ' set '
    || P99_X_UC1 || ' = decode('  || P99_X_UV1 ||','''','|| P99_X_UC1  ||',''@'',null,'|| P99_X_UV1  ||'),'
    || P99_X_UC2 || ' = decode('  || P99_X_UV2 ||','''','|| P99_X_UC2  ||',''@'',null,'|| P99_X_UV2  ||'),'
    || P99_X_UC3 || ' = decode('  || P99_X_UV3 ||','''','|| P99_X_UC3  ||',''@'',null,'|| P99_X_UV3  ||') where '||
    P99_X_COL  ||' '|| P99_X_O  ||' ' || P99_X_V  || ' and ' ||
    P99_X_COL2 ||' '|| P99_X_O2 ||' ' || P99_X_V2 || ' and ' ||
    P99_X_COL3 ||' '|| P99_X_O3 ||' ' || P99_X_V3 || ' and ' ||
    P99_X_COL4 ||' '|| P99_X_O4 ||' ' || P99_X_V4 || ' and ' ||
    P99_X_COL5 ||' '|| P99_X_O5 ||' ' || P99_X_V5 || ' and ' ||
    P99_X_CD   ||       ' = '         || P99_X_VD ;
    --dbms_output.put_line(l_sql_stmt); 
    EXECUTE IMMEDIATE l_sql_stmt;
    END;

    Hi Ivan,
    I do not think that the decode is performance relevant. Maybe the update hangs because some other transaction has uncommitted changes to one of the affected rows or the where clause is not selective enough and needs to update a huge amount of records.
    Besides that - and I might be wrong, because I only know some part of your app - the code here looks like you have a huge sql injection vulnerability here. Maybe you should consider re-writing your logic in static sql. If that is not possible, you should make sure that the user input only contains allowed values, e.g. by white-listing P99_X_On (i.e. make sure they only contain known values like '=', '<', ...), and by using dbms_assert.enquote_name/enquote_literal on the other P99_X_nnn parameters.
    Regards,
    Christian
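Christian's white-listing idea can be sketched as follows (illustrative only; in APEX you would implement the equivalent check in PL/SQL, together with dbms_assert for the column names and literals). The operator list here is an assumption, not taken from the app:

```python
# Hypothetical white-list of comparison operators allowed in the filter UI.
ALLOWED_OPS = {"=", "<", ">", "<=", ">=", "<>", "LIKE"}

def safe_op(op):
    """Pass an operator through to dynamic SQL only if it is white-listed;
    anything else (including injected fragments) raises instead."""
    canonical = op.strip().upper()
    if canonical not in ALLOWED_OPS:
        raise ValueError(f"operator not allowed: {op!r}")
    return canonical
```

The point of the design is that user input never reaches the SQL string verbatim: only a value chosen from a known, finite set does.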

  • Override action "save as" command with save for web batch processing

    Hi everyone,
    I've created an action that uses the save for web dialogue to optimise images for the web. When I use the batch command to process a full folder of images, even though I have "Override action "save as" command" checked, it ignores this and still uses the location that was used when the action was created.
    Apparently this is a known issue in CS6 and previous versions but I wondered if anyone knows whether this has been fixed in Photoshop CC?
    Appreciate any advice.
    Thanks

    Since Save For Web is an export plugin, none of the destination options have any effect on where the files are saved or on the file names.
    Save For Web only saves the files to the folder specified when you recorded the action.
    (It's always been that way, and still is in Photoshop CC.)
    You might look at the Image Processor Pro
    (included in Dr. Brown’s Services 2.3.1)
    (has a save for web option)
    http://www.russellbrown.com/scripts.html

  • FF_5 - Issue with Account Balance option while processing BAI file

    Hi All,
    We are getting a runtime error while trying to process the same bank file again through transaction FF_5 with the 'Account balance' option checked. The message says 'The ABAP/4 Open SQL array insert results in duplicate database records.' with the command 'Insert_FEBPI'. This happens only when the bank file contains a combination of uploaded and not-yet-uploaded statements.
    I could find an OSS note for the same issue, but it was for the MT942 file format; we are using the BAI file format. Can someone please help me in this regard?
    Thanks in advance!
    Regards,
    Jalendhar

    This is standard SAP functionality. If there are no applicable notes, then you should open an OSS message.
    Rob

  • Issue with VLC streaming through the Dolphin FTP client

    I am not sure if I should post this here or in the multimedia section; excuse me if I am mistaken. I have been using Arch Linux for almost 4 weeks with GNOME 3. I installed KDE today after removing GNOME (pacman -Rscn gnome). I have a small home server running an FTP server where I have some media files (.mkv). In GNOME I usually used Nautilus to access the folder (sftp://servername.dyndns.org/srv/) and ran VLC to stream the media. I tried to do the same with KDE's Dolphin, with no luck.
    At first I was not able to view the folder with Dolphin. I researched and found this thread with a solution:
    http://forum.kde.org/viewtopic.php?f=18 … 5&start=15
    Now I am able to access the folder, but I can't stream any media using VLC.
    The following errors are from vlc:
    Your input can't be opened:
    VLC is unable to open the MRL.
    sftp://servername.dyndns.org/srv/filename.mkv
    Check the log for details.
    Going into Tools > Messages gives the following:
    main error: open of
    sftp://servername.dyndns.org/srv/filename.mkv failed:(null)
    I suspect something is wrong with the FTP client configuration in Dolphin. This bug report also bothers me:
    http://old.nabble.com/-Bug-274170--New% … 05158.html
    Another thing is that I am able to open the media files using other media players (Dragon Player, MPlayer), but then the files are automatically downloaded into the local folder /var/tmp/kdecache-username/krun/filename.mkv, and the media player loads the file from there. It's the same if I open a text file on the server, edit it, and save it: it is first downloaded to the /var/ folder, and then I am asked to upload it back to the server when it is closed.
    My conclusion is that the FTP client in Dolphin does not work properly, or that KDE's KIO works differently from whatever is used in GNOME/Nautilus. I hope I have explained the issue as well as possible and would appreciate any help or leads on how to solve this.

    Thanks for your reply. After checking some ideas, I found that to get the FTP to work I needed to use the internal IP 10.0.0.1 for it rather than the old normal IP from before the firewall (probably a very beginner error, sorry for that). And I discovered the exact issue causing clients not to see their characters when not using Hamachi.
    Hamachi treats everyone joined to the network as though they are local, I believe, so when the server sends character info via Hamachi it thinks it is sending the info locally, and then Hamachi itself sends it out to the external client. While tracing the data, I found that the login process sends a 33-byte packet of data to the client via TCP from port 5051 out to the client on a random port, usually in the range 40000-50000, telling the client to send a request to the other process to ask for the character information.
    Now, for some reason, when a client logs in through Hamachi that packet is sent correctly to their Hamachi IP and received fine (sent from 10.0.0.1:5051 to the Hamachi IP:40k-50k). But when a client tries to log in using a normal IP with Hamachi turned off, the login process does send the 33-byte packet from 10.0.0.1:5051 to the client's WAN IP through the usual port, yet the client never receives this packet and as such does not request the character information.
    So my guess is that something on the 5505 is disallowing the login process from sending the data externally to the clients' WAN IPs? Though this is very odd, because it does allow the client to actually log in to the account and seems to receive at least part of that information fine.
    If any help can be given that might resolve this for me, I would very much appreciate it; this issue is limiting my client base and as such my income and business as a whole. Thank you in advance for any help given.

  • Issues with AUTO cycling through ....

    I'm trying to do this:
    Any help with one or the other is very much appreciated!
    1) When the timer finishes auto-cycling through the tabs (1 to 16) of the ViewStack and switches over to tab 1 to STOP, I would like to call a function to do something.
    The question now is how to write the code to detect that the timer has come to a STOP on tab 1, and how I can incorporate this into the existing onTimerOne function.
    2) The second item I'm after: if I manually select any tab (1 to 16), I'd also like to call a function to do something.
    3) The third item I'm after is to automatically reset the ViewStack to tab 1 when I click a button.
    <mx:Script>
    <![CDATA[
        import flash.events.TimerEvent;
        import flash.utils.Timer;

        private var timerOne:Timer;

        private function initOne():void {
            timerOne = new Timer(5000, myViewStack.numChildren);
            timerOne.addEventListener(TimerEvent.TIMER, onTimerOne);
        }

        private function onTimerOne(evt:TimerEvent):void {
            if (myViewStack.selectedIndex == myViewStack.numChildren - 1) {
                myViewStack.selectedIndex = 0;
                return;
            }
            myViewStack.selectedIndex++;
        }

        private function autoOne():void {
            if (!timerOne.running) {
                timerOne.start();
            }
        }

        private function manualOne():void {
            if (timerOne.running) {
                timerOne.stop();
            }
        }
    ]]>
    </mx:Script>
    4) The fourth item I'm trying to work out: as I'm reading my data from an XML file, I'd like a TextArea that shows the different countries from the XML file for each ViewStack tab while auto-cycling through these tabs (1 to 16).
    The difficulty here is that I use this XML with a specific urlID="1" to urlID="16", as partly shown below.
    <urlsOceania>
        <urlOceania urlID="1"/>
        <searchCountry>American Samoa</searchCountry>
        <etc></etc>
    </urlsOceania>
    I'm reading all the other items this way:
    source="{urlsOceania.urlOceania.(@urlID==1).etc}"
    Thanks in advance aktell2007
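For question 1, detecting that the cycle has come back to tab 1 is just modular-index bookkeeping, which onTimerOne already does implicitly. A minimal language-neutral sketch of that logic (names are hypothetical):

```python
def next_index(current, count):
    """Advance a cyclic selection through `count` tabs.

    Returns (new_index, wrapped); wrapped is True exactly when the
    selection has just come back to the first tab (index 0)."""
    new = (current + 1) % count
    return new, new == 0
```

When `wrapped` is True, the handler could stop the timer and call the "do something" function, which is the hook the question asks for.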

    Thanks for the confirmation. 7 miles away is most likely using the same VZW tower, but it does confirm for us that the problem is not in your current location.
    You can look up local tower locations from many public websites such as the following:
    www.antennasearch.com
    www.cellreception.com
    http://www.evdoinfo.com/content/view/2990/63/
    The signal of -65 shows you have strong reception, but it doesn't show the entire picture. Your tower could be overloaded or unauthenticating you. There are lots of little issues outside the raw signal strength, between the towers and the connecting devices, that we users have no control over. As you may guess, only a tower tech has access to identify and correct these things.
    Based on the picture of the back of the MBR1515/Netgear N300 router from Netgear, I would assume that only a normal-sized SIM card will fit; I would not assume a micro SIM card will fit. Since I do not have access to either the VZW or the non-VZW 4G LTE router, I cannot confirm whether it will work. You might have to give Netgear a call and ask. Based on what I can see from the user guides of both devices, the SIMs used for each should be compatible with each other.
    If you decide to purchase the non vzw version please post back your findings for us.

  • Issue with G_FXX  while creating a Process

    Hello Friends,
    I have a Manual Tabular form made with APEX_ITEM, I am trying to create a process to perform DML operations based on the values from the G_FXX arrays of APEX_APPLICATION.
    Below is the error I am getting:
    ORA-06550: line 9, column 30: PLS-00302: component 'G_F51' must be declared
    ORA-06550: line 9, column 1: PL/SQL: Statement ignored
    Corresponding line of the insert process:
    for i in 1..apex_application.g_f51.count
    loop
        if ( apex_application.g_f51(i) is not null ) then
            update CP_INTERFACE_SCH_INFO
            set INTF_NAME = apex_application.g_f52(i),
    Corresponding region code:
    select apex_item.hidden(50,CAP_PLAN_ID) ||
    apex_item.hidden(51,LINE_NO) ||
    apex_item.text(52,INTF_NAME) INTF_NAME,
    apex_item.select_list_from_query(53,TYPE,'SELECT TYPE,TYPE_ID FROM
    and so on. The form is generated properly, and I can check that the labels show up correctly.
    I have multiple other regions and processes on the same page with similar syntax working with out issues. I have spent quite some time, but could not think of any problem...
    Please help me ...

    Hi Dir,
    Thanks for responding.
    In my case: I have multiple regions, all tabular forms made with APEX_ITEM, each having 7-8 columns, so I have run out of arrays. I was updating the collections on the region-specific SAVE buttons. (I created the collections On Load - Before Header with APEX_COLLECTION.CREATE_TRUNCATE_COLLECTION.) I wanted to load the actual tables from the collections upon hitting the final SUBMIT button.
    Please let me know if there is a better practice to handle this situation.
    Thanks..
