Data Dump and Exceptions

After reading the docs on the Data Pump open call (dbms_datapump.open), it looks like the exception JOB_EXISTS is predefined, but I can't get my code to recognize it. Do I need to declare something? Any ideas?
declare
  my_handle   number;
  my_db_link  varchar2(500) := 'DWREDEVLINK';
  my_job_name varchar2(500) := 'LISTING_21';
begin
  begin
    my_handle := dbms_datapump.open(operation   => 'IMPORT',
                                    job_mode    => 'TABLE',
                                    remote_link => my_db_link,
                                    job_name    => my_job_name,
                                    version     => 'LATEST');
  exception
    when JOB_EXISTS then
      my_handle := dbms_datapump.attach(job_name => 'LISTING_21');
  end;
end;
ERROR at line 16:
ORA-06550: line 16, column 8:
PLS-00201: identifier 'JOB_EXISTS' must be declared
ORA-06550: line 9, column 1:

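It may simply be that, on your release, the exception names listed in the Data Pump documentation are not declared in the DBMS_DATAPUMP package specification (and even when a package does declare an exception, it has to be referenced with the package prefix, e.g. dbms_datapump.job_exists, not as a bare identifier). A sketch of a workaround is to map the underlying "job already exists" error, ORA-31634, onto a locally declared exception with PRAGMA EXCEPTION_INIT and handle that instead:

declare
  job_already_exists exception;
  pragma exception_init(job_already_exists, -31634);  -- ORA-31634: job already exists
  my_handle   number;
  my_db_link  varchar2(500) := 'DWREDEVLINK';
  my_job_name varchar2(500) := 'LISTING_21';
begin
  begin
    my_handle := dbms_datapump.open(operation   => 'IMPORT',
                                    job_mode    => 'TABLE',
                                    remote_link => my_db_link,
                                    job_name    => my_job_name,
                                    version     => 'LATEST');
  exception
    when job_already_exists then
      -- the job is already there, so attach to it instead of opening a new one
      my_handle := dbms_datapump.attach(job_name => my_job_name);
  end;
end;
/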

Similar Messages

  • Script cannot find perl Data:Dumper module

    Good day. I live in Australia and use a script called Shepherd to update my mythtv program listings.
    After upgrading a couple of days ago (first time in about 6 months) Shepherd has stopped working.
    Shepherd doesn't seem to find the perl data-dumper module, which is part of the base perl install.
    All my packages are up to date as of this post, and I am on the current kernel -
    uname -r
    3.4.9-1-ARCH
    ~/.shepherd/shepherd --check
    ERROR:
    Mandatory module 'Data::Dumper' not found.
    Please see the Wiki at http://svn.whuffy.com/wiki/Installation
    for details on how to install this module.
    Shepherd is updating ~/.shepherd/output.xmltv (the listings information scraped from the net) but evidently needs the dumper module to post the data into MythTV.
    I'm not sure if it has something to do with this, which I don't properly understand -
    The current Arch Linux default perl installation installs updates to core modules into the perl core directories, creating file conflicts.  Examples include modules such as Data::Dumper and version.
    which is mentioned here.
    I appear to have it installed as part of the perl base package -
    sudo pacman -Qi perl
    Name : perl
    Version : 5.16.1-1
    URL : http://www.perl.org
    Licenses : GPL PerlArtistic
    Groups : base
    Provides : perl-archive-extract=0.58 perl-archive-tar=1.82 perl-attribute-handlers=0.93 perl-autodie=2.10
    perl-autoloader=5.72 perl-autouse=1.07 perl-b-debug=1.17 perl-b-deparse=1.14 perl-b-lint=1.14
    perl-base=2.18 perl-bignum=0.29 perl-carp=1.26 perl-cgi=3.59 perl-compress-raw-bzip2=2.048
    perl-compress-raw-zlib=2.048 perl-constant=1.23 perl-cpan=1.9800 perl-cpan-meta=2.120630
    perl-cpan-meta-yaml=0.007 perl-cpanplus=0.9121 perl-cpanplus-dist-build=0.62 perl-data-dumper=2.135
    So it seems Shepherd is suddenly not finding it for some reason, after the upgrade.
    I tried installing the perl-data-dumper packages from the AUR (simple and concise) and neither of those made any difference.
    I also tried installing the module with a cpan command (as root), but that gave me errors (which I'd rather not have to deal with unless I must, to fix this issue).
    cpan
    Terminal does not support AddHistory.
    cpan shell -- CPAN exploration and modules installation (v1.9800)
    Enter 'h' for help.
    cpan[1]> install data:dumper
    Reading '/root/.cpan/sources/authors/01mailrc.txt.gz'
    sh: /bin/gzip: No such file or directory
    ............................................................................DONE
    Reading '/root/.cpan/sources/modules/02packages.details.txt.gz'
    sh: /bin/gzip: No such file or directory
    Warning: Your /root/.cpan/sources/modules/02packages.details.txt.gz does not contain a Line-Count header.
    Please check the validity of the index file by comparing it to more
    than one CPAN mirror. I'll continue but problems seem likely to
    happen.
    Warning: Your /root/.cpan/sources/modules/02packages.details.txt.gz does not contain a Last-Updated header.
    Please check the validity of the index file by comparing it to more
    than one CPAN mirror. I'll continue but problems seem likely to
    happen.
    Subroutine AUTOLOAD redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 31.
    Subroutine import redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 49.
    Subroutine tv_interval redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 70.
    .Could not split line["H\"rW{0M4K\cV1=YFH\cV\c^rqv\c?\c]hKUe\cGL)4:4\cH]C\cPW`_|#MU4}\c_\c_%\cQ\cN\cX8M}%e\cR|\cG-K\c?~\cAM_cREx#AnvAZDi\$\c\!h\cZ\cI^uG0 )>TQ}p{Q[Ah\cNIK\cV+R"]
    Could not split line["-6W/GQp\cOm`>UwBP[5"]
    Could not split line["\@\cPI\cW,(zY2]uA!fVegsf\\V\cGfL\cHY4ki\cFvLgY\\\c\Ed\cK\cTe-\cC0\@d!?\cSU/\\G8\cAt?(\cY.L\cZ\cBrtl~s[4\@\cF\cWu*D>I"]
    Could not split line["hb\cSKA1\cCjX(w/K\cA"]
    Giving up parsing your /root/.cpan/sources/modules/02packages.details.txt.gz, too many errorsReading '/root/.cpan/sources/authors/01mailrc.txt.gz'
    sh: /bin/gzip: No such file or directory
    ............................................................................DONE
    Reading '/root/.cpan/sources/modules/02packages.details.txt.gz'
    sh: /bin/gzip: No such file or directory
    Warning: Your /root/.cpan/sources/modules/02packages.details.txt.gz does not contain a Line-Count header.
    Please check the validity of the index file by comparing it to more
    than one CPAN mirror. I'll continue but problems seem likely to
    happen.
    Subroutine AUTOLOAD redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 31.
    Subroutine import redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 49.
    Subroutine tv_interval redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 70.
    Warning: Your /root/.cpan/sources/modules/02packages.details.txt.gz does not contain a Last-Updated header.
    Please check the validity of the index file by comparing it to more
    than one CPAN mirror. I'll continue but problems seem likely to
    happen.
    Subroutine AUTOLOAD redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 31.
    Subroutine import redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 49.
    Subroutine tv_interval redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 70.
    .Could not split line["H\"rW{0M4K\cV1=YFH\cV\c^rqv\c?\c]hKUe\cGL)4:4\cH]C\cPW`_|#MU4}\c_\c_%\cQ\cN\cX8M}%e\cR|\cG-K\c?~\cAM_cREx#AnvAZDi\$\c\!h\cZ\cI^uG0 )>TQ}p{Q[Ah\cNIK\cV+R"]
    Could not split line["-6W/GQp\cOm`>UwBP[5"]
    Could not split line["\@\cPI\cW,(zY2]uA!fVegsf\\V\cGfL\cHY4ki\cFvLgY\\\c\Ed\cK\cTe-\cC0\@d!?\cSU/\\G8\cAt?(\cY.L\cZ\cBrtl~s[4\@\cF\cWu*D>I"]
    Could not split line["hb\cSKA1\cCjX(w/K\cA"]
    Giving up parsing your /root/.cpan/sources/modules/02packages.details.txt.gz, too many errorsTerminal does not support GetHistory.
    Lockfile removed.
    It says that the 02packages.details.txt.gz file is missing, but it looks like it's there -
    ls /root/.cpan/sources/modules/
    02packages.details.txt.gz 03modlist.data.gz
    I thought I might uninstall and reinstall perl but I got dependency errors and I'm not game to force an uninstall unless I get advice to do so -
    pacman -Rs perl
    checking dependencies...
    error: failed to prepare transaction (could not satisfy dependencies)
    :: automake: requires perl
    :: cdrkit: requires perl
    :: ddclient: requires perl
    :: git: requires perl>=5.14.0
    :: groff: requires perl
    :: gtk-doc: requires perl
    :: hspell: requires perl
    :: hyphen: requires perl
    :: imagemagick: requires perl
    :: lm_sensors: requires perl
    :: mod_perl: requires perl
    :: openssl: requires perl
    :: perl-algorithm-diff: requires perl
    :: perl-archive-zip: requires perl>=5.10.0
    :: perl-class-factory-util: requires perl
    :: perl-class-inspector: requires perl>=5.6.0
    :: perl-class-load: requires perl
    :: perl-class-methodmaker: requires perl>=5.10.0
    :: perl-class-singleton: requires perl>=5.10.0
    :: perl-convert-binhex: requires perl
    :: perl-data-dump: requires perl>=5.006
    :: perl-data-optlist: requires perl-scalar-list-utils
    :: perl-datetime-format-builder: requires perl
    :: perl-dbi: requires perl
    :: perl-digest-sha1: requires perl
    :: perl-email-date-format: requires perl
    :: perl-encode-locale: requires perl>=5.008
    :: perl-error: requires perl>=5.10.0
    :: perl-fcgi: requires perl
    :: perl-file-listing: requires perl>=5.8.8
    :: perl-file-slurp: requires perl>=5.14.0
    :: perl-font-afm: requires perl>=5.5.0
    :: perl-html-form: requires perl>=5.8.8
    :: perl-html-formattext: requires perl>=5.10.0
    :: perl-html-parser: requires perl>=5.12.1
    :: perl-html-tagset: requires perl>=5.10.0
    :: perl-html-tree: requires perl>=5.10.0
    :: perl-http-cache-transparent: requires perl>=5.10.0
    :: perl-http-cookies: requires perl>=5.8.8
    :: perl-http-daemon: requires perl>=5.8.8
    :: perl-http-date: requires perl>=5.8.8
    :: perl-http-message: requires perl>=5.8.8
    :: perl-http-negotiate: requires perl>=5.8.8
    :: perl-image-size: requires perl
    :: perl-io-socket-ssl: requires perl>=5.10.0
    :: perl-io-string: requires perl>=5.10.0
    :: perl-io-stringy: requires perl
    :: perl-libwww: requires perl>=5.8.1
    :: perl-lingua-en-numbers-ordinate: requires perl>=5.10.0
    :: perl-lingua-preferred: requires perl>=5.10.0
    :: perl-linux-pid: requires perl
    :: perl-list-moreutils: requires perl>=5.5.30
    :: perl-lwp-mediatypes: requires perl>=5.6.2
    :: perl-lwp-protocol-https: requires perl>=5.8.1
    :: perl-math-round: requires perl
    :: perl-mime-lite: requires perl
    :: perl-mime-types: requires perl
    :: perl-mozilla-ca: requires perl>=5.006
    :: perl-net-http: requires perl>=5.6.2
    :: perl-net-upnp: requires perl
    :: perl-package-stash: requires perl>=5.8.1
    :: perl-package-stash: requires perl-scalar-list-utils
    :: perl-package-stash-xs: requires perl>=5.8.1
    :: perl-params-classify: requires perl>=5.10.1
    :: perl-params-util: requires perl>=5.5.30
    :: perl-params-validate: requires perl
    :: perl-parse-recdescent: requires perl>=5.10.0
    :: perl-perl4-corelibs: requires perl>=5.006
    :: perl-soap-lite: requires perl
    :: perl-socket6: requires perl
    :: perl-sub-exporter: requires perl>=5.6.0
    :: perl-sub-install: requires perl
    :: perl-sub-uplevel: requires perl
    :: perl-term-readkey: requires perl
    :: perl-test-warn: requires perl>=5.10.0
    :: perl-text-iconv: requires perl
    :: perl-timedate: requires perl
    :: perl-tk: requires perl
    :: perl-unicode-string: requires perl
    :: perl-unicode-utf8simple: requires perl>=5.10.0
    :: perl-uri: requires perl>=5.10.0
    :: perl-www-robotrules: requires perl>=5.8.1
    :: perl-xml-dom: requires perl>=5.10.0
    :: perl-xml-namespacesupport: requires perl
    :: perl-xml-parser: requires perl
    :: perl-xml-regexp: requires perl
    :: perl-xml-sax: requires perl
    :: perl-xml-sax-base: requires perl
    :: perl-xml-simple: requires perl
    :: perl-xml-twig: requires perl
    :: perl-xml-writer: requires perl
    :: perl-xml-xpath: requires perl
    :: perl-yaml: requires perl>=5.10.0
    :: perl-yaml-syck: requires perl
    :: rsync: requires perl
    :: system-tools-backends: requires perl
    :: webmin: requires perl
    Any help would be greatly appreciated.
    Thanks
    belbo

    Thanks
    Results as follows -
    perl -MData::Dumper -e1
    Perl API version v5.14.0 of Data::Dumper does not match v5.16.0 at /usr/lib/perl5/site_perl/XSLoader.pm line 95.
    Compilation failed in require.
    BEGIN failed--compilation aborted.
    $ perl /mnt/Data/Scripts/Perl_Module_Check.pl wheremod Data::Dumper
    Data::Dumper /usr/lib/perl5/site_perl/Data/Dumper.pm
    Data::Dumper /usr/lib/perl5/core_perl/Data/Dumper.pm
    I ran the full delete command after backing up the directories (there was a lot in them)
    rm -rf /usr/lib/perl5/site_perl/* /usr/share/perl5/site_perl/*
    This resulted in a different missing module -
    $ sudo rm -rf /usr/lib/perl5/site_perl/* /usr/share/perl5/site_perl/*
    $ ~/.shepherd/shepherd --check
    ERROR:
    Mandatory module 'List::Compare' not found.
    I restored the backup and deleted only Data::Dumper but then got a missing XMLTV module -
    $ ~/.shepherd/shepherd --check
    $[ used in numeric lt (<) (did you mean $] ?) at /usr/share/perl5/site_perl/XML/Twig.pm line 7278.
    $[ used in numeric lt (<) (did you mean $] ?) at /usr/share/perl5/site_perl/XML/Twig.pm line 7284.
    $[ used in numeric lt (<) (did you mean $] ?) at /usr/share/perl5/site_perl/XML/Twig.pm line 7296.
    ERROR:
    Mandatory module 'XMLTV' not found.
    Following this post, I did the following, which returned no results -
    https://bbs.archlinux.org/viewtopic.php?pid=1110598
    $ sudo pacman -Qml | awk '/_perl\/auto\/.+\.so$/ { print $1 }' | uniq
    I then did the following -
    $ sudo rm -rf /usr/lib/perl5/*
    $ sudo pacman -S perl
    warning: perl-5.16.1-1 is up to date -- reinstalling
    resolving dependencies...
    looking for inter-conflicts...
    Targets (1): perl-5.16.1-1
    Total Installed Size: 48.33 MiB
    Net Upgrade Size: 0.00 MiB
    Proceed with installation? [Y/n]
    (1/1) checking package integrity [#######################################################] 100%
    (1/1) loading package files [#######################################################] 100%
    (1/1) checking for file conflicts [#######################################################] 100%
    (1/1) checking available disk space [#######################################################] 100%
    (1/1) upgrading perl [#######################################################] 100%
    $ ~/.shepherd/shepherd --check
    ERROR:
    Mandatory module 'XMLTV' not found.
    Oops. Then I saw your next post saying not to do that, as you'd lose the vendor modules, which I did. I had backed up, though, so I restored and still have the vendor stuff.
    In case its of any assistance, this is what my perl lib and share directories look like -
    [ben@htpc ~]$ ls /usr/lib/perl5
    core_perl site_perl site_perl.bak vendor_perl
    [ben@htpc ~]$ ls /usr/lib/perl5/site_perl
    Alien auto Data Encode encoding.pm Filter List Net perlfilter.pod SVN Term Unicode version.pm XML
    Attribute Compress Digest Encode.pm FCGI.pm filter-util.pl Math Params Storable.pm Sys Time version version.pod XSLoader.pm
    [ben@htpc ~]$ ls /usr/lib/perl5/vendor_perl
    Apache Attribute Crypt DateTimePP.pm dbixs_rev.pl GD.pm Image ModPerl Package SNMP.pm Text Unicode YAML
    Apache2 auto DateTime DBD Digest gv.pm JSON mod_perl2.pm Params Socket6.pm Tie Win32
    APR Bundle DateTime.pm DBI FCGI.pm gv.so Linux Net qd.pl SVN Tk Xfce4
    APR.pm Class DateTimePPExtra.pm DBI.pm GD HTML List NetSNMP RRDs.pm Term Tk.pm XML
    [ben@htpc ~]$ ls /usr/lib/perl5/core_perl
    arybase.pm Compress Cwd.pm DynaLoader.pm Fcntl.pm I18N List ODBM_File.pm perllocal.pod Socket.pm threads.pm
    attributes.pm Config_git.pl Data Encode File IO Math Opcode.pm POSIX.pm Storable.pm Tie
    auto Config_heavy.pl DB_File.pm Encode.pm Filter IO.pm MIME O.pm re.pm Sys Time
    B Config.pm Devel encoding.pm GDBM_File.pm IPC mro.pm ops.pm Scalar Text Unicode
    B.pm CORE Digest Errno.pm Hash lib.pm NDBM_File.pm PerlIO SDBM_File.pm threads
    [ben@htpc ~]$ ls /usr/share/perl5/
    core_perl site_perl vendor_perl
    [ben@htpc ~]$ ls /usr/share/perl5/core_perl
    AnyDBM_File.pm bignum.pm Config Digest.pm fields.pm IO Object PerlIO.pm Term utf8.pm
    App bigrat.pm constant.pm DirHandle.pm File IPC open.pm pod Test vars.pm
    Archive blib.pm CPAN Dumpvalue.pm FileCache.pm JSON overload Pod Test.pm version
    Attribute bytes_heavy.pl CPANPLUS dumpvar.pl FileHandle.pm less.pm overloading.pm Safe.pm Text Version
    autodie bytes.pm CPANPLUS.pm Encode filetest.pm Locale overload.pm Search Thread version.pm
    autodie.pm Carp CPAN.pm encoding Filter locale.pm Package SelectSaver.pm Thread.pm vmsish.pm
    AutoLoader.pm Carp.pm DBM_Filter English.pm FindBin.pm Log Params SelfLoader.pm Tie warnings
    AutoSplit.pm CGI DBM_Filter.pm Env.pm Getopt Math parent.pm sigtrap.pm Time warnings.pm
    autouse.pm CGI.pm DB.pm Exporter HTTP Memoize Parse sort.pm Unicode XSLoader.pm
    B _charnames.pm deprecate.pm Exporter.pm I18N Memoize.pm Perl strict.pm unicore
    base.pm charnames.pm Devel ExtUtils if.pm Module perl5db.pl subs.pm UNIVERSAL.pm
    Benchmark.pm Class diagnostics.pm Fatal.pm inc Net perlfaq.pm Symbol.pm User
    bigint.pm Compress Digest feature.pm integer.pm NEXT.pm PerlIO TAP utf8_heavy.pl
    [ben@htpc ~]$ ls /usr/share/perl5/site_perl
    Alien bigint.pm Bundle Class CPANPLUS DateTime ExtUtils HTML IPC Locale LWP.pm Math MythTV Object Try YAML
    Archive bignum.pm CGI Compress CPANPLUS.pm Encode File HTTP JSON LWP lwptut.pod Module MythTV.pm Test WWW YAML.pm
    Attribute bigrat.pm CGI.pm CPAN Date Env.pm Filter IO List lwpcook.pod Mail Mozilla Net Tie XML
    [ben@htpc ~]$ ls /usr/share/perl5/vendor_perl
    abbrev.pl bigrat.pl ctime.pl Error Font HTML LWP.pm open2.pl SOAP Time validate.pl
    Algorithm cacheout.pl Data Error.pm ftp.pl HTTP Mail open3.pl stat.pl timelocal.pl WWW
    Apache CGI Date exceptions.pl getcwd.pl Image Math Package Sub Tree XML
    Archive CGI.pm DateTime fastcwd.pl getopt.pl importenv.pl MIME Parse syslog.pl Try XMLRPC
    assert.pl chat2.pl Dist File getopts.pl IO Module Perl4 tainted.pl UDDI XMLTV
    auto Class dotsh.pl finddepth.pl Git Lingua Mozilla pwd.pl Term Unicode XMLTV.pm
    bigfloat.pl complete.pl Email find.pl Git.pm look.pl Net RRDp.pm termcap.pl URI YAML
    bigint.pl Convert Encode flush.pl hostname.pl LWP newgetopt.pl shellwords.pl Test URI.pm YAML.pm
    Any further assistance would be greatly appreciated.
    Thanks
    belbo

  • Create data dump in Oracle 10g

    Hi All,
    Can somebody please tell me how to create a data dump and then import the database structure and data from an existing database?
    The data dump would consist of the complete database structure, I suppose.
    I am using Oracle 10g Rel 1.
    Thanks in Advance,
    Tanuja

    Note: neither classic export nor Data Pump is really a database backup. You only perform a logical data extraction, and it is difficult to recreate a lost database based only on export dumps. Additionally, you will lose data, because you can only restore the data as it was when you did the export; changes made later are lost. See export/import as an add-on, but back up your database physically. The best way to do this is (in my opinion) RMAN.
    http://download.oracle.com/docs/cd/B19306_01/backup.102/b14193/toc.htm
    Werner
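    If you do decide you want a logical dump as well, a minimal Data Pump sketch in PL/SQL, assuming a directory object (here DATA_PUMP_DIR) exists and you have read/write on it, and with illustrative file names, might look like this:
    declare
      h         number;
      job_state varchar2(30);
    begin
      -- export the current schema to DATA_PUMP_DIR as my_schema.dmp, with a log file
      h := dbms_datapump.open(operation => 'EXPORT', job_mode => 'SCHEMA');
      dbms_datapump.add_file(handle => h, filename => 'my_schema.dmp',
                             directory => 'DATA_PUMP_DIR');
      dbms_datapump.add_file(handle => h, filename => 'my_schema.log',
                             directory => 'DATA_PUMP_DIR',
                             filetype  => dbms_datapump.ku$_file_type_log_file);
      dbms_datapump.start_job(h);
      dbms_datapump.wait_for_job(h, job_state);  -- blocks until the job finishes
    end;
    /
    The import side follows the same pattern with operation => 'IMPORT', and the expdp/impdp command-line clients drive this same package. As noted above, though, treat it as a data extract, not as your backup.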

  • Cisco User Data Dump

    When I manually run the Cisco User Data Dump, it creates the report without any issues. When I try to schedule the task, or if I run it from the command line, "E:\Program Files (x86)\Cisco Systems\Unity Connection\User Data Dump\Unity Connection User Data Dump.exe" /Silent=[xx.xx.xx.xx], the file is created but never gets any data. I followed the instructions about saving the configuration, I checked "run whether logged in or off" and highest privilege, the startup directory is E:\Program Files (x86)\Cisco Systems\Unity Connection\User Data Dump, and I am in the administrative group. The request just seems to hang, waiting for some input. Are there any other command-line options I can use, or is there another "hidden" or non-published setting that I need to configure?
    If I look at the task scheduler, the Last Run Results say "The task is currently running (0x41301)".

    I am having the same issue. Was there a solution in place for this? I am up to date with the UDD. I have reviewed the help documents again and again. glamarca, did you ever find a solution?

  • I have a production mobile Flex app that uses RemoteObject calls for all data access, and it's working well, except for a new remote call I just added that only fails when running with a release build.  The same call works fine when running on the device

    I have a production mobile Flex app that uses RemoteObject calls for all data access, and it's working well, except for a new remote call I just added that only fails when running with a release build. The same call works fine when running on the device (iPhone) using a debug build. When running with a release build, the result handler is never called (nor is the fault handler called). Viewing the BlazeDS logs in debug mode, the call is received and sent back with data. I've narrowed it down to what seems to be a data size issue.
    I have targeted one specific data call that returns in the String value a string length of 44 KB, which fails in the release build (result or fault handler never called), but the result handler is called as expected in the debug build. When I do not populate the String value on the object (in the server-side Java code, just setting it to an empty string), the result handler is then called, and the object is returned (release build).
    The custom object being returned in the call is a very simple object, with getters/setters for the simple types boolean, int, String, and one org.w3c.dom.Document type. This same object type is used on other RemoteObject calls (different data) and works fine (release and debug builds). I originally was returning it as a Document but, just to make sure this wasn't the problem, changed the value to be returned to a String, just to rule out XML/DOM issues in serialization.
    I don't understand 1) why the release build vs. debug build behavior is different for a RemoteObject call, and 2) why the calls work in a debug build when sending over a somewhat large (but not unreasonable) amount of data in a String object, but not in a release build.
    I haven't tried to find out exactly where the failure point in size is, but I'm not sure that's even relevant, since 44 KB isn't an unreasonable size to expect.
    By turning on Debug mode in BlazeDS, I can see the object and its attributes being serialized, and everything looks good there. The calls are received and processed appropriately in BlazeDS for both debug and release build testing.
    Anyone have an idea on other things to try to debug/resolve this?
    Platform testing is BlazeDS 4, Flash Builder 4.7, WebSphere 8 server, and iPhone (iOS 7.1.2). I tried using multiple Flex SDKs, 4.12 to the latest 4.13, with no change in behavior.
    Thanks!

    After a week's worth of debugging, I found the issue.
    The Java type returned from the call was defined as ArrayList.  Changing it to List resolved the problem.
    I'm not sure why ArrayList isn't a valid return type; I've been looking at the Adobe docs and still can't see why it isn't valid. And why it works in debug mode and not in a release build is even stranger. Maybe someone can shed some light on the logic here for me.

  • SQL Dev 1.5.4+: Scripting DDL and data dumps?

    In SQL Dev 1.5.4, can I script a DDL and data dump? If not, what about 2.0? If not 2.0, has anyone requested this functionality so I can vote for it? I find it frustrating that, while doing a Database Export, I can't even pre-declare (e.g. save) the set of objects I want to dump; sometimes, you want to selectively dump, and it's a pain to hunt and peck and select just those you want to dump. Easy if you want to dump everything but 2-3 objects in a large schema. Not so easy if you only need, say, 20 out of 100 objects to be dumped (e.g. for domain or configuration tables--some subset of the whole schema).
    I'm really enjoying SQL Developer 1.5.4, by the way. Despite its flaws, I'm pretty happy with it. Looking forward to 2.0 and beyond. Good work, SQL Dev team.
    Thanks very much.
    Dana

    They're all command line tools, so they can all be wrapped up in a batch or shell script. Bummer you can't access them... Hope you find a better solution.

    Thanks K. I should be getting Oracle 10g Express Edition on my desktop soon--critical because we don't have full access to the Development instance. It's like putting changes through a straw over to the DBAs. I'm not sure why Development is locked-down to developers, but that's the way it is.
    Any chance that Oracle 10g Express Edition comes with scriptable Data Pump binaries? I will still need authorization, but maybe that's one way to go. I hate trying to write my own data pump in Python or any other language. It seems a bit absurd to me, but I suppose there are reasons.
    Dana
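    For the DDL half, one scriptable sketch outside SQL Developer, assuming you can reach the schema from SQL*Plus (the object names below are made up), is to spool DBMS_METADATA output for just the objects you care about:
    set long 1000000 pagesize 0 linesize 200 trimspool on
    spool subset_ddl.sql
    -- pull DDL for a hand-picked subset of tables instead of the whole schema
    select dbms_metadata.get_ddl(object_type, object_name)
      from user_objects
     where object_type = 'TABLE'
       and object_name in ('CONFIG_CODES', 'CONFIG_RULES');
    spool off
    A saved script like that gives you the pre-declared subset of objects you mention, and the data side can be wrapped the same way around exp or expdp once you have access to the binaries.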

  • Workload Analysis and Exception Analysis - Overview data n/a

    Hello,
    About Root cause analysis.
    In Workload Analysis and Exception Analysis, the KPI data in the right pane of the Overview tab
    is always n/a.
    In the other tabs, e.g. the ABAP and Java instance tabs, the data is displayed.
    So I think the data is being stored properly in BI.
    Please give me some advice on what setup is missing.
    Regards

    SAP Note 1032461

  • Significance of Rescheduling Date and Exception group

    Hi Guys:
    Could somebody explain the significance of the rescheduling date and the exception group in the MD04 transaction?
    thanks
    sweth

    Hi,
    Rescheduling dates are not stored; they are calculated dynamically every time you execute or refresh MD04.
    The rescheduling date is based on the stock/requirements date. It does not take into account the Procurement and Scheduling parameters in the material master. In the net requirements calculation, the system checks whether warehouse stock or firmed receipts are available to cover requirements. If a material shortage exists, the system usually creates a new procurement proposal.
    Most catchable runtime errors are assigned to exception groups; for that, go through the link below. Thank you.
    http://www.sapnet.ru/abap_docu/ABENSYSEXC-ERRKL.htm

  • IMessage won't activate my cellphone number. I have tried rebooting and turning the iMessage switches on and off. I still have data bundles and am able to access all social media platforms except iMessage. Why won't it activate my cellphone anymore? Help!

    iMessage won't activate my cellphone number. I have tried rebooting and turning the iMessage switches on and off. I still have data bundles and am able to access all social media platforms except iMessage. Why won't it activate my cellphone anymore? Help! Does anyone know any fixes for this problem?

    See if this helps: http://support.apple.com/kb/ts4268

  • Hi all,  need data file and co file after single transport in 6.4 or 6.7

    hi all
    following is the requirement
    To process/compile the attached programs (given below) on a 6.4/7 kernel (SAP version 6.4 or 6.7) and send a single transport (data file and co-file).
    These data files and co-files are flat files.
    Can anyone help me get these flat files? I need them urgently.
    What you have to do is:
    If you have a 6.4 or 6.7 system, just transport the following 10 programs in a single transport in a test or development system; after getting the data file and co-file you can revert the transport.
    Program files are as follows (total number of programs is 10):
    1.
    FUNCTION Z_3N_CKS_EXIST_USER .
    ""Local interface:
    *" IMPORTING
    *" VALUE(CKSUSRID) TYPE USR02-BNAME
    *" VALUE(CKSMDTID) TYPE USR02-MANDT DEFAULT SY-MANDT
    *" EXPORTING
    *" VALUE(RCODE) TYPE SY-SUBRC
    *" EXCEPTIONS
    *" USER_DONT_EXIST
    *" USER_EXISTS
    FUNCTION TO CHECK IF USER EXISTS
    CLEAR RCODE.
    CALL FUNCTION 'USER_EXISTS'
    EXPORTING
    BNAME = CKSUSRID
    CLIENT = CKSMDTID
    EXCEPTIONS
    USER_DONT_EXIST = 1
    USER_EXISTS = 0.
    RCODE = SY-SUBRC.
    ENDFUNCTION.
    2.
    FUNCTION Z_3N_CKS_LOCKSTATE.
    ""Local interface:
    *" IMPORTING
    *" VALUE(CKSUSRID) TYPE USR02-BNAME
    *" EXPORTING
    *" VALUE(RCODE) TYPE SY-SUBRC
    FUNCTION TO OBTAIN THE CURRENT LOCK STATUS FOR A USER
    *{ PASSGOAPR06
    *\DATA:LOCKSTATE(50) type c,
    *\C_LOCKED_BY_ADMIN like usr02-uflag.
    *\tables:usr02.
    DATA:LOCKSTATE(50) type c.
    *} PASSGOAPR06
    CLEAR: RCODE, LOCKSTATE.
    SELECT SINGLE * FROM USR02 WHERE BNAME = CKSUSRID.
    IF SY-SUBRC <> 0.
    RCODE = 01. "No such User
    EXIT.
    ENDIF.
    *{ PASSGOAPR06
    IF USR02-UFLAG Z C_LOCKED_BY_ADMIN AND
    USR02-UFLAG Z C_LOCKED_BY_FAILED_LOGON.
    LOCKSTATE = 'UNLOCKED'.
    RCODE = 00.
    ELSE.
    IF USR02-UFLAG O C_LOCKED_BY_FAILED_LOGON.
    LOCKSTATE = 'LOCKED_BY_FAILED_LOGON'.
    RCODE = 02.
    ENDIF.
    IF USR02-UFLAG O C_LOCKED_BY_ADMIN.
    LOCKSTATE = 'LOCKED_BY_ADMIN'.
    RCODE = 02.
    ENDIF.
    ENDIF.
    uflag = usr02-uflag.
    IF UFLAG Z C_LOCKED_BY_ADMIN AND
    UFLAG Z C_LOCKED_BY_FAILED_LOGON.
    LOCKSTATE = 'UNLOCKED'.
    RCODE = 00.
    ELSE.
    IF UFLAG O C_LOCKED_BY_FAILED_LOGON.
    LOCKSTATE = 'LOCKED_BY_FAILED_LOGON'.
    RCODE = 02.
    ENDIF.
    IF UFLAG O C_LOCKED_BY_ADMIN.
    LOCKSTATE = 'LOCKED_BY_ADMIN'.
    RCODE = 02.
    ENDIF.
    ENDIF.
    *} PASSGOAPR06
    ENDFUNCTION.
    3.
    FUNCTION Z_3N_CKS_PWDCHG_INITIAL.
    ""Local interface:
    *" IMPORTING
    *" VALUE(CKSUSRID) TYPE USR02-BNAME
    *" VALUE(CKSUSRPWD) TYPE RSYST-BCODE
    *" EXPORTING
    *" VALUE(RCODE) TYPE SY-SUBRC
    *" TABLES
    *" RETURN STRUCTURE BAPIRET2
    FUNCTION TO INITIALISE USERs PASSWORD, USER WILL BE FORCED
    TO CHANGE PASSWORD ON NEXT LOGIN
    CLEAR: USR02, RCODE.
    SELECT SINGLE * from USR02 WHERE BNAME = CKSUSRID.
    IF SY-SUBRC <> 0.
    RCODE = 01.
    else.
    CALL FUNCTION 'BAPI_USER_CHANGE'
    EXPORTING
    USERNAME = CKSUSRID
    PASSWORD = CKSUSRPWD
    PASSWORDX = 'X'
    TABLES
    RETURN = RETURN.
    loop at return.
    if return-type eq 'E' or return-type eq 'A'.
    rcode = 13.
    endif.
    endloop.
    endif.
    ENDFUNCTION.
    4.
    FUNCTION Z_3N_CKS_VERIFY_USER.
    ""Local interface:
    *" IMPORTING
    *" VALUE(CKSUSRID) TYPE RSYST-BNAME
    *" VALUE(CKSUSRPWD) TYPE RSYST-BCODE OPTIONAL
    *" EXPORTING
    *" VALUE(RCODE) TYPE SY-SUBRC
    FUNCTION TO VALIDATE A USER
    *{ PASSGOAPR06
    TABLES:USR02.
    CLEAR: USR02, RCODE.
    SELECT SINGLE * from USR02 WHERE BNAME = CKSUSRID.
    IF SY-SUBRC = 4.
    RCODE = 01. "no such user
    EXIT.
    ELSEIF CKSUSRPWD = SPACE.
    RCODE = 03. "invalid old password
    EXIT.
    ELSE.
    CALL FUNCTION 'SUSR_LOGIN_CHECK_RFC'
    EXPORTING
    BNAME = CKSUSRID
    PASSWORD = CKSUSRPWD
    EXCEPTIONS
    WAIT = 1
    USER_LOCKED = 2
    USER_NOT_ACTIVE = 3
    PASSWORD_EXPIRED = 4
    WRONG_PASSWORD = 5
    NO_CHECK_FOR_THIS_USER = 6
    INTERNAL_ERROR = 7
    OTHERS = 8.
    CASE SY-SUBRC.
    WHEN '2'. RCODE = 02. "user disabled/blocked
    WHEN '3'. RCODE = 02. "user disabled/blocked
    WHEN '4'. RCODE = 03. "invalid old password
    WHEN '5'. RCODE = 03. "invalid old password
    WHEN '8'. RCODE = 12. "internal error
    ENDCASE.
    ENDIF.
    *} PASSGOAPR06
    ENDFUNCTION.
    5.
    *& Include ZMS01JTOP *
    PROGRAM MS01JTOP MESSAGE-ID 01 LINE-SIZE 132. "Berechtigungsdatenpflege
    13.08.93
    INCLUDE MS01CTP2.
    INCLUDE MS01CTCO.
    TABLES: XU200, XU213, XU310, XU350, XU390, XU400.
    TABLES: TSTC, TSP03, TPARA, TPARAT.
    TABLES: *USR01, *USR03, USR15.
    TABLES: SOUD, SOUD3.
    *ABLES: ZCSA, ADRS.
    *{ PASSGOAPR06
    TABLES: usr02.
    DATA: uflag type x.
    DATA: begin of return occurs 0.
    INCLUDE structure bapiret2.
    DATA: end of return.
    DATA calling_cksusrid like usr02-bname.
    DATA: init_pass like BAPIPWD.
    INCLUDE USER_CONSTANTS.
    *} PASSGOAPR06
    CONTROLS TC213 TYPE TABLEVIEW USING SCREEN 213.
    CONTROLS TC520 TYPE TABLEVIEW USING SCREEN 350.
    DATA: COPYOK TYPE I,
    RENAMEOK TYPE I,
    DATFM1,
    DATFM2,
    DATFM3,
    DATFM4,
    DCPFM1,
    DCPFM2,
    USERNAME LIKE USR01-BNAME,
    LOCK,
    UNLO,
    STATFLAG TYPE I VALUE 0,
    NAVIFLAG TYPE I VALUE 0,
    PARTOPIX TYPE I,
    PARFILL TYPE I,
    PARAMETER LIKE USR05-PARVA,
    PARID LIKE USR05-PARID,
    PARLOOP LIKE SY-STEPL,
    SHOW_ONLY VALUE ' ',
    INTPRO_LOADED TYPE I VALUE 0,
    EXT_SECURITY VALUE ' '.
    DATA: H_201_USGRP LIKE USGRP-USERGROUP,
    H_201_VALID TYPE C,
    CC201 LIKE SY-CUCOL VALUE 2,
    CR201 LIKE SY-CUROW VALUE 6,
    SAVE_LINE201 LIKE SY-LILLI VALUE 1,
    SAVE_LSIND201 LIKE SY-LSIND VALUE 1.
    DATA: OFFICENAME LIKE SOUD-USRNAM.
    DATA: BEGIN OF NAME_IN.
    INCLUDE STRUCTURE SOUD3.
    DATA: END OF NAME_IN.
    DATA: BEGIN OF NAME_OUT.
    INCLUDE STRUCTURE SOUD3.
    DATA: END OF NAME_OUT.
    DATA: BEGIN OF EMPTYPROF OCCURS 2.
    INCLUDE STRUCTURE USREF.
    DATA: END OF EMPTYPROF.
    DATA: BEGIN OF PROFILES OCCURS 10.
    INCLUDE STRUCTURE USREF.
    DATA: END OF PROFILES.
    DATA: MAXPAR TYPE I VALUE 300.
    DATA: BEGIN OF TABPAR OCCURS 300,
    PARID LIKE USR05-PARID,
    PARVA LIKE USR05-PARVA,
    END OF TABPAR.
    DATA: BEGIN OF DELTAB OCCURS 50,
    USGRP LIKE USR02-CLASS,
    END OF DELTAB.
    DATA: BEGIN OF ADDTAB OCCURS 50,
    USGRP LIKE USR02-CLASS,
    END OF ADDTAB.
    DATA: BEGIN OF ADDRESS_DATA.
    INCLUDE STRUCTURE SADRP_USR.
    DATA: END OF ADDRESS_DATA.
    DATA:
    CLEAR TYPE X VALUE '00'.
    *ATA: BEGIN OF ADRSDATEN.
    INCLUDE STRUCTURE ADRS.
    *ATA: END OF ADRSDATEN.
    06.10.95 Tosun
    DATA 930_FLAG.
    "$$
    6.
    FUNCTION Z_3N_CKS_LOCK_USER.
    ""Local interface:
    *" IMPORTING
    *" VALUE(CKSUSRID) TYPE USR02-BNAME
    *" EXPORTING
    *" VALUE(RCODE) TYPE SY-SUBRC
    CLEAR RCODE.
    *{ PASSGOAPR06
    *\ PERFORM LOCK_USER IN PROGRAM ZSAPMS01J USING CKSUSRID.
    *\ IF SY-SUBRC <> 0.
    *\ RCODE = SY-SUBRC.
    *\ EXIT.
    *\ ELSE.
    *\ COMMIT WORK.
    *\ ENDIF.
    * the report (form) is dumping,
    * so we try it with the correct BAPI
    CALL FUNCTION 'BAPI_USER_LOCK'
    EXPORTING
    USERNAME = CKSUSRID
    TABLES
    RETURN = return.
    IF return-type <> 'S'.
    RCODE = '8'.
    rollback work.
    EXIT.
    ENDIF.
    *} PASSGOAPR06
    ENDFUNCTION.
    7.
    FUNCTION Z_3N_CKS_PWDCHG_DIRECT.
    ""Local interface:
    *" IMPORTING
    *" VALUE(CKSUSRID) TYPE USR02-BNAME
    *" VALUE(CKSUSRPWD) TYPE RSYST-BCODE
    *" EXPORTING
    *" VALUE(RCODE) TYPE SY-SUBRC
    *" TABLES
    *" RETURN STRUCTURE BAPIRET2
    FUNCTION TO CHANGE USERs PASSWORD
    Password is initialised to a fixed value,
    to avoid having to provide the valid old password
    If the password change fails, the change is
    rolled back - this is required because
    BAPI_USER_CHANGE does an internal commit.
    CLEAR: USR02, RCODE.
    DATA: L_TIME LIKE SY-UZEIT,
    ZLIN TYPE I.
    *{ PASSGOAPR06
    calling_cksusrid = cksusrid. "Save calling userid
    init_pass = 'INITPASS'.
    *} PASSGOAPR06
    do 1 times.
    SELECT SINGLE * from USR02 WHERE BNAME = CKSUSRID.
    IF SY-SUBRC <> 0.
    RCODE = 1.
    else.
    * Change login to initpass.
    CALL FUNCTION 'BAPI_USER_CHANGE'
    EXPORTING
    USERNAME = CKSUSRID
    *{ PASSGOAPR06
    *\ PASSWORD = 'INITPASS'
    PASSWORD = init_pass
    *} PASSGOAPR06
    PASSWORDX = 'X'
    TABLES
    RETURN = RETURN.
    * Evaluate return table, if not success, rcode = 13
    describe table return lines zlin.
    IF zlin > 0.
    IF return-type ne 'S'.
    rcode = 13.
    exit.
    ENDIF.
    ENDIF.
    * Wait 1 second, otherwise table ush02 gets the same key as before.
    * Not very good, but its the only way, because the wait up to
    * statement includes a db-commit.
    L_TIME = sy-uzeit.
    WHILE L_TIME = sy-uzeit.
    GET TIME.
    ENDWHILE.
    * Change 'INITPASS' to input login
    CALL FUNCTION 'SUSR_USER_CHANGE_PASSWORD_RFC'
    EXPORTING
    BNAME = CKSUSRID
    PASSWORD = 'INITPASS'
    NEW_PASSWORD = CKSUSRPWD
    NEW_BCODE = '0000000000000000'
    NEW_CODVN = ' '
    EXCEPTIONS
    CHANGE_NOT_ALLOWED = 1
    PASSWORD_NOT_ALLOWED = 2
    INTERNAL_ERROR = 3
    CANCELED_BY_USER = 4
    OTHERS = 5.
    case sy-subrc.
    when '0'. rcode = 0.
    when '1'. rcode = 2.
    when '2'. rcode = 4.
    when '3'. rcode = 12.
    when '4'. rcode = 2.
    when '5'. rcode = 12.
    ENDCASE.
    ENDIF.
    exit. "end of do 1 times "
    enddo.
    * rollback if it didn't work
    if not rcode is initial.
    rollback work.
    endif.
    ENDFUNCTION.
    8.
    FUNCTION Z_3N_CKS_UNLOCK_USER.
    ""Local interface:
    *" IMPORTING
    *" VALUE(CKSUSRID) TYPE USR02-BNAME
    *" EXPORTING
    *" VALUE(RCODE) TYPE SY-SUBRC
    CLEAR RCODE.
    *{ PASSGOAPR06
    *\ PERFORM UNLOCK_USER IN PROGRAM ZSAPMS01J USING CKSUSRID.
    *\ IF SY-SUBRC <> 0.
    *\ RCODE = SY-SUBRC.
    *\ EXIT.
    *\ ELSE.
    *\ COMMIT WORK.
    *\ ENDIF.
    * the report (form) is dumping,
    * so we try it with the correct BAPI
    CALL FUNCTION 'BAPI_USER_UNLOCK'
    EXPORTING
    USERNAME = CKSUSRID
    TABLES
    RETURN = return.
    IF return-type <> 'S'.
    RCODE = '8'.
    rollback work.
    EXIT.
    ENDIF.
    *} PASSGOAPR06
    ENDFUNCTION.
    9.
    *& Include ZMS01JO10 *
    MS01JO10 Module before Output
    14.05.93
    MODULE D150_SELECT *
    Einen Eintrag aus der Liste uebernehmen. *
    MODULE D150_SELECT OUTPUT.
    IF SELE = 1.
    IF SY-LILLI < 3.
    MESSAGE S209.
    ELSE.
    IF USRFLAG = 10 AND SY-LILLI = 3.
    MESSAGE S209.
    ELSE.
    IF USRFLAG = 10.
    XU150-VON = SY-LISEL.
    ELSE.
    COUNTX = PUSR - 1.
    ASSIGN SY-LISEL+COUNTX(12) TO <TEXT>. " unicode
    WRITE <TEXT> TO XU150-VON.
    ENDIF.
    ENDIF.
    ENDIF.
    SELE = 0.
    ENDIF.
    IF FERTIG = 2.
    FCODE = 'BACK'.
    SUPPRESS DIALOG.
    ENDIF.
    ENDMODULE.
    MODULE D150_SETSTATUS *
    PF-Status setzen *
    MODULE D150_SETSTATUS OUTPUT.
    PERFORM SET_STATUS USING 150.
    XU150-SELPROF = XU150-SELFEST = XU150-SELADRE = XU150-SELPARA = 'X'.
    XU150-SELMENU = 'X'.
    ENDMODULE.
    MODULE D155_SETSTATUS *
    PF-Status setzen *
    MODULE D155_SETSTATUS OUTPUT.
    PERFORM SET_STATUS USING 155.
    ENDMODULE.
    MODULE D200_SELECT *
    Einen Eintrag aus der Liste ins Dynpro uebernehmen *
    MODULE D200_SELECT OUTPUT.
    IF SELE = 1.
    IF SY-LILLI < 3.
    MESSAGE S209.
    ELSE.
    XU200-XUSER = SY-LISEL.
    ENDIF.
    SELE = 0.
    ENDIF.
    IF FCODE2 = 'USER' OR FCODE2 = 'FEST' OR FCODE2 = 'ADRE' OR
    FCODE2 = 'PARA' OR FCODE2 = 'ADMI' OR FCODE2 = 'RESE' OR
    FCODE2 = 'N '.
    SUPPRESS DIALOG.
    ENDIF.
    PERFORM SET_STATUS USING 200.
    ENDMODULE.
    MODULE D213_VALOUT *
    Festwerte auf das Dynpro schreiben. *
    MODULE D213_VALOUT OUTPUT.
    DEL = 0.
    XU213-DIA = '.'.
    XU213-ODC = '.'.
    XU213-BDC = '.'.
    XU213-CPIC = '.'.
    XU213-BATCH = '.'.
    CASE USR02-USTYP.
    WHEN TYPDIA.
    XU213-DIA = 'X'.
    WHEN TYPBATCH.
    XU213-BATCH = 'X'.
    WHEN TYPCPIC.
    XU213-CPIC = 'X'.
    WHEN TYPBDC.
    XU213-BDC = 'X'.
    WHEN TYPODC.
    XU213-ODC = 'X'.
    ENDCASE.
    IF USR02-LTIME <> SPACE AND USR02-LTIME <> '000000'.
    LOOP AT SCREEN.
    CASE SCREEN-GROUP1.
    WHEN 'MOD'.
    SCREEN-INVISIBLE = '1'.
    SCREEN-INPUT = '0'.
    MODIFY SCREEN.
    ENDCASE.
    ENDLOOP.
    SET CURSOR FIELD 'USR02-CLASS'.
    IF F <> ' ' AND L <> 0.
    SET CURSOR FIELD F LINE L.
    ENDIF.
    CODEFLAG = 1.
    ELSE.
    CLEAR XU213-BCODE.
    CLEAR XU213-BCODE2.
    IF USR02-BCODE <> '0000000000000000' AND BCODE_C = SPACE.
    CODEFLAG = 0.
    ENDIF.
    IF CODEFLAG = -2.
    SET CURSOR FIELD 'XU213-BCODE'.
    MESSAGE S290.
    ELSE.
    SET CURSOR FIELD 'USR02-CLASS'.
    IF F <> ' ' AND L <> 0.
    SET CURSOR FIELD F LINE L.
    ENDIF.
    ENDIF.
    ENDIF.
    ENDMODULE.
    MODULE D213_SELECT *
    Einen Eintrag aus der Liste uebernehmen. *
    MODULE D213_SELECT OUTPUT.
    IF SELE = 1.
    IF SY-LILLI < 4.
    MESSAGE S209.
    ELSE.
    ASSIGN SY-LISEL(PROFLNG) TO <TEXT>.
    MOVE <TEXT> TO XU213-PROFILE.
    PERFORM AUTH_CHECK USING OBJ_PROF
    XU213-PROFILE SPACE ACT_INCLUDE RC.
    IF RC <> 0.
    MESSAGE S478 WITH XU213-PROFILE.
    ELSE.
    FOUND = 0.
    LOOP AT TABUSR.
    IF TABUSR-PROFILE = XU213-PROFILE.
    FOUND = 1.
    MESSAGE S268 WITH XU213-PROFILE.
    EXIT.
    ENDIF.
    ENDLOOP.
    IF FOUND = 0.
    PERFORM EXIST_USR10
    USING XU213-PROFILE AKTIVATED SPACE RC.
    CLEAR TABUSR.
    TABUSR-PROFILE = XU213-PROFILE.
    IF USR10-TYP = COLECTPROF.
    TABUSR-SAMPROF = 'X'.
    ENDIF.
    Profiletext lesen
    CLEAR USR11.
    SELECT SINGLE * FROM USR11
    WHERE LANGU = SY-LANGU
    AND PROFN = TABUSR-PROFILE
    AND AKTPS = AKTIVATED.
    TABUSR-PTEXT = USR11-PTEXT.
    APPEND TABUSR.
    XU213-FILL = XU213-FILL + 1.
    IF XU213-FILL >= MAXUSR.
    MESSAGE S269.
    ENDIF.
    UCHANGE = 1.
    PERFORM NOTSAVED.
    ENDIF.
    ENDIF.
    ENDIF.
    SELE = 0.
    ENDIF.
    PERFORM SET_STATUS USING 213.
    PERFORM MESSAGE.
    IF EXT_SECURITY <> '1'.
    LOOP AT SCREEN.
    IF SCREEN-GROUP1 = 'EXT'.
    SCREEN-INPUT = '0'.
    SCREEN-INVISIBLE = '1'.
    SCREEN-ACTIVE = '0'.
    MODIFY SCREEN.
    ENDIF.
    ENDLOOP.
    ENDIF.
    ENDMODULE.
    MODULE D213_PROFOUT *
    Profiles auf den Bildschirm ausgeben. *
    MODULE D213_PROFOUT OUTPUT.
    include <symbol>.
    COUNTX = XU213-TOPIX + SY-STEPL - 1. "Bild-oben-Pos. in Tab. feststell
    IF COUNTX <= XU213-FILL. "Am Ende der Tabelle ?
    READ TABLE TABUSR INDEX COUNTX. "Tab. lesen
    IF SY-SUBRC = 0.
    MOVE-CORRESPONDING TABUSR TO XU213. "Daten auf den Bildschirm
    xu213-samprof = sym_documents.
    ENDIF.
    ENDIF.
    USRLOOP = SY-LOOPC.
    ENDMODULE.
    MODULE D254_SUPPRESS *
    Dialog fuer Dynpro 254 unterdruecken *
    MODULE D254_SUPPRESS OUTPUT.
    SUPPRESS DIALOG.
    ENDMODULE.
    MODULE D310_SETSTATUS *
    PF-Status setzen *
    MODULE D310_SETSTATUS OUTPUT.
    IF SHOW_ONLY = SPACE.
    PERFORM SET_STATUS USING 310.
    IF STATFLAG = 1.
    LOOP AT SCREEN.
    IF SCREEN-NAME = 'USR01-CATTKENNZ '.
    AUTHORITY-CHECK OBJECT 'S_DEVELOP'
    ID 'DEVCLASS' DUMMY
    ID 'OBJTYPE' FIELD 'SCAT'
    ID 'OBJNAME' DUMMY
    ID 'P_GROUP' DUMMY
    ID 'ACTVT' FIELD '70'.
    IF SY-SUBRC <> 0.
    SCREEN-INPUT = 0.
    SCREEN-INVISIBLE = 1.
    MODIFY SCREEN.
    ENDIF.
    ENDIF.
    ENDLOOP.
    ENDIF.
    ELSE.
    PERFORM SET_STATUS USING 330.
    LOOP AT SCREEN.
    IF SCREEN-GROUP1 = 'RO '.
    SCREEN-INPUT = 0.
    MODIFY SCREEN.
    ENDIF.
    ENDLOOP.
    ENDIF.
    CLEAR FCODE.
    ENDMODULE.
    MODULE D310_FORMAT_OUT. *
    Datumsformat und Dezimalpunktformat entsprechend Daten aus usr01 *
    ankreuzen. *
    MODULE D310_FORMAT_OUT OUTPUT.
    XU310-DATFM1 = ' '.
    XU310-DATFM2 = ' '.
    XU310-DATFM3 = ' '.
    XU310-DATFM4 = ' '.
    XU310-DATFM5 = ' '.
    XU310-DCPFM1 = ' '.
    XU310-DCPFM2 = ' '.
    XU310-SPDB1 = ' '.
    XU310-SPDA1 = ' '.
    IF USR01-DATFM < 1 OR USR01-DATFM > 5.
    CALL 'C_SAPGPARAM'
    ID 'NAME' FIELD 'zcsa/moddatfm'
    ID 'VALUE' FIELD USR01-DATFM.
    ENDIF.
    CASE USR01-DATFM.
    WHEN 1.
    XU310-DATFM1 = 'X'.
    WHEN 2.
    XU310-DATFM2 = 'X'.
    WHEN 3.
    XU310-DATFM3 = 'X'.
    WHEN 4.
    XU310-DATFM4 = 'X'.
    WHEN 5.
    XU310-DATFM5 = 'X'.
    WHEN OTHERS.
    XU310-DATFM1 = 'X'.
    ENDCASE.
    IF USR01-DCPFM = ' '.
    XU310-DCPFM1 = 'X'.
    ELSE.
    XU310-DCPFM2 = 'X'.
    ENDIF.
    IF USR01-SPDB = 'G'.
    XU310-SPDB1 = 'X'.
    ENDIF.
    IF USR01-SPDA = 'D'.
    XU310-SPDA1 = 'X'.
    ENDIF.
    CLEAR TSP03.
    SELECT SINGLE * FROM TSP03
    WHERE PADEST = USR01-SPLD.
    ENDMODULE.
    MODULE D320_SETSTATUS *
    PF-Status setzen *
    MODULE D320_SETSTATUS OUTPUT.
    IF SHOW_ONLY = SPACE.
    PERFORM SET_STATUS USING 320.
    ELSE.
    PERFORM SET_STATUS USING 340.
    LOOP AT SCREEN.
    IF SCREEN-GROUP1 = 'RO '.
    SCREEN-INPUT = 0.
    MODIFY SCREEN.
    ENDIF.
    ENDLOOP.
    ENDIF.
    CLEAR FCODE.
    ENDMODULE.
    MODULE D330_SETSTATUS *
    PF-Status setzen *
    MODULE D330_SETSTATUS OUTPUT.
    PERFORM SET_STATUS USING 330.
    ENDMODULE.
    MODULE D340_SETSTATUS *
    PF-Status setzen *
    MODULE D340_SETSTATUS OUTPUT.
    PERFORM SET_STATUS USING 340.
    ENDMODULE.
    MODULE D350_SETSTATUS *
    PF-Status setzen *
    MODULE D350_SETSTATUS OUTPUT.
    IF SELE = 1.
    IF SY-LILLI < 3.
    MESSAGE S209.
    ELSE.
    TABPAR-PARID = SY-LISEL.
    APPEND TABPAR.
    PARFILL = PARFILL + 1.
    ENDIF.
    SELE = 0.
    ENDIF.
    IF SHOW_ONLY = SPACE.
    PERFORM SET_STATUS USING 350.
    ELSE.
    PERFORM SET_STATUS USING 360.
    LOOP AT SCREEN.
    IF SCREEN-GROUP1 = 'RO '.
    SCREEN-INPUT = 0.
    MODIFY SCREEN.
    ENDIF.
    ENDLOOP.
    ENDIF.
    CLEAR FCODE.
    ENDMODULE.
    MODULE D350_PAROUT *
    Parameter auf den Bildschirm ausgeben. *
    MODULE D350_PAROUT OUTPUT.
    COUNTX = PARTOPIX + SY-STEPL - 1. "Bild-oben-Pos. in Tab. feststell
    IF COUNTX <= PARFILL. "Am Ende der Tabelle ?
    READ TABLE TABPAR INDEX COUNTX. "Tab. lesen
    IF SY-SUBRC = 0.
    MOVE-CORRESPONDING TABPAR TO XU350. "Daten auf den Bildschirm
    SELECT SINGLE * FROM TPARAT
    WHERE SPRACHE = SY-LANGU
    AND PARAMID = TABPAR-PARID.
    ENDIF.
    ENDIF.
    PARLOOP = SY-LOOPC.
    ENDMODULE.
    MODULE D351_SETSTATUS *
    PF-Status setzen *
    *ODULE D351_SETSTATUS OUTPUT.
    IF SELE = 1.
    IF SY-LILLI < 3.
    MESSAGE S209.
    ELSE.
    XU350-PARID = SY-LISEL.
    ENDIF.
    SELE = 0.
    ENDIF.
    PERFORM SET_STATUS USING 352.
    SUPPRESS DIALOG.
    *NDMODULE.
    MODULE D360_SETSTATUS *
    PF-Status setzen *
    MODULE D360_SETSTATUS OUTPUT.
    PERFORM SET_STATUS USING 360.
    ENDMODULE.
    MODULE D390_STATUS *
    PF-Status setzen. *
    MODULE D390_STATUS OUTPUT.
    data uflag_x type x. " unicode
    SELECT SINGLE * FROM USR02
    WHERE BNAME = XU200-XUSER.
    uflag_x = USR02-UFLAG. " unicode
    IF SY-SUBRC <> 0.
    XU390-STATTEXT = ' Nicht vorhanden.'(222).
    ELSE.
    IF uflag_x Z YULOCK AND uflag_x Z YUSLOC. " unicode
    PERFORM SET_STATUS USING 390.
    XU390-STATTEXT = ' Nicht gesperrt. '(223).
    ELSE.
    IF uflag_x O YULOCK. " unicode
    PERFORM SET_STATUS USING 391.
    XU390-STATTEXT = ' Durch Falschanmeldungen gesperrt !!!'(224).
    ENDIF.
    IF uflag_x O YUSLOC. " unicode
    PERFORM SET_STATUS USING 391.
    XU390-STATTEXT = ' Durch Systemmanager gesperrt !!!'(225).
    ENDIF.
    ENDIF.
    ENDIF.
    LOCK = '.'.
    UNLO = '.'.
    ENDMODULE.
    MODULE D400_CLEAR_CODE *
    Passwortfeld loeschen. *
    MODULE D400_CLEAR_CODE OUTPUT.
    CLEAR XU400-NEWCODE.
    CLEAR XU400-NEWCODE1.
    ENDMODULE.
    MODULE D400_SETSTATUS *
    PF-Status setzen *
    MODULE D400_SETSTATUS OUTPUT.
    PERFORM SET_STATUS USING 400.
    ENDMODULE.
    MODULE D500_SUPPRESS *
    Dynpro unterdruecken *
    MODULE D500_SUPPRESS OUTPUT.
    SET PF-STATUS '0200'.
    SUPPRESS DIALOG.
    ENDMODULE.
    *& Module D214_SETSTATUS OUTPUT
    MODULE D214_SETSTATUS OUTPUT.
    SET PF-STATUS '0214'.
    SET TITLEBAR '214'.
    ENDMODULE. " D214_SETSTATUS OUTPUT
    *& Module D216_PROFOUT OUTPUT
    MODULE D216_PROFOUT OUTPUT.
    COUNTX = XU213-TOPIX2 + SY-STEPL - 1. "Bild-oben-Pos. in Tab. festst
    IF COUNTX <= XU213-FILL2. "Am Ende der Tabelle ?
    READ TABLE INTPRO2 INDEX COUNTX. "Tab. lesen
    IF SY-SUBRC = 0.
    MOVE-CORRESPONDING INTPRO2 TO XU213. "Daten auf den Bildschirm
    ENDIF.
    ENDIF.
    ENDMODULE. " D216_PROFOUT OUTPUT
    *& Module D504_STATUS OUTPUT
    MODULE D504_STATUS OUTPUT.
    SET PF-STATUS '0504'.
    SET TITLEBAR '604'.
    ENDMODULE. " D504_STATUS OUTPUT
    *& Module D217_SETSTATUS
    MODULE D217_SETSTATUS OUTPUT.
    PERFORM SET_STATUS USING 217.
    PERFORM MESSAGE.
    IF FERTIG = -1.
    SUPPRESS DIALOG.
    ENDIF.
    ENDMODULE. " D217_SELECT OUTPUT
    *& Module D202_SETSTATUS OUTPUT
    text *
    MODULE D202_SETSTATUS OUTPUT.
    SET PF-STATUS '0203'.
    SET TITLEBAR '203'.
    ENDMODULE. " D202_SETSTATUS OUTPUT
    *& Module D200_LESEN_MEMORY OUTPUT
    Schnittstelle mit RSUSR008
    call transaction su01 and skip first screen
    MODULE D200_LESEN_MEMORY OUTPUT.
    IMPORT FCODE FROM MEMORY ID 'OK_CODE'.
    EXPORT FCODE FROM SPACE TO MEMORY ID 'OK_CODE'.
    ENDMODULE. " D200_LESEN_MEMORY OUTPUT
    "$$
    10.
    *& Report ZSAPMS01J *
    23.10.92
    set extended check off.
    INCLUDE ZMS01JTOP.
    *INCLUDE MS01JTOP. "Datendeklarationen
    INCLUDE ZMS01JO10.
    *INCLUDE MS01JO10. "PBO-Module
    *INCLUDE ZMS01JI10.
    INCLUDE MS01JI10. "PAI-Module
    *INCLUDE ZMS01JF10.
    INCLUDE MS01JF10. "Forms
    *INCLUDE ZMS01JR10.
    INCLUDE MS01JR10. "Reporting
    *INCLUDE ZMS01CC10.
    INCLUDE MS01CC10. "Checks
    *INCLUDE ZMS01CD10.
    INCLUDE MS01CD10. "Datenzugriffe
    set extended check on.
    See, there are in total 10 programs that should go in one transport order. This will create the data file and co-file; these are flat files, and I want these flat files.
    Do it in a development or test server and then delete or revert it once you have the data file and co-file.
    I will be grateful to you if you could send me these flat files.
    Thanks in advance,
    raj

    Hi all,
    this can be done in version 4.6 also.
    Thanks for spending time on this.
    Thanks & regards,
    raj

  • SAP Short Dumps and PCI Compliance

    We've run into an issue with our PCI Compliance audit around being able to see unencrypted credit cards in short dump messages in SAP.  Has anyone run into this issue?
    The only workaround I've got at this point is to restrict all access to short dumps and require many documented sign-offs before turning access to a short dump on and off. This is pretty cumbersome, and it still leaves a hole in my overall security.
    We've managed to purge restricted CC data from our XI logging, and done everything right with encryption, but this short dump issue just doesn't seem to have a solution.
    Can anyone help?  We're on 6.0.
    Thanks!

    Hi David,
    This is an interesting situation you have described. ABAP short dumps, or runtime errors as they are also known, are unhandled exceptions during program execution. The conditions that cause such exceptions are unknown or cannot be handled at runtime. To help analyze what went wrong with the program during execution, it is necessary for the dump to contain all possible information, including data values passed between programs when the error occurs. Encryption of restricted data values is a program step in itself. If the dump were to occur after this step, then of course it would contain encrypted CC info. Unfortunately, in your case it exposes restricted CC info because the dump occurs BEFORE this step.
    I don't believe there is a way to prevent this from happening -- for the same reason that the program logic does not know at run-time how to "handle" the exception. If occurrences of such dumps is fairly common in your system, you may want to investigate the likely causes -- for example, missing or incorrect customization. Analyzing the short dumps will probably give you a clue. Your customization team may be able to identify a pre-condition that causes this unhandled exception. If this exception can then be handled (via a program change) that returns a meaningful error instead of a short dump you would be able to close the security hole. This however entails modification to SAP standard code. I don't usually recommend such changes, but given the sensitive nature of your data it may be worth consideration.
    I personally advocate restricted access to ST22. The steps you have undertaken to enforce this may be cumbersome despite efforts to keep them simple; I suppose that's the price we pay in administering the system. If you have not already done so, you may also want to ensure that short dumps containing restricted CC info are not saved (using the "Keep" feature in ST22) for easy retrieval at a later point in time, or, if they are saved, that they are available only to 'restricted eyes'. Short dumps are normally kept in the system for 7 or 14 days (I'm not sure of the exact number). The bigger challenge, in my opinion, is: how do you prevent the restricted info from being viewed by the user who encounters the short dump during program/transaction execution? No amount of security controls around ST22 will mitigate this risk. The only option that remains is a program change (as mentioned above). But to get there you first need to know what causes the exception.
    Regards.
    Ashutosh

  • How to Handle Large Dumps and Complex Calculations in OBIEE

    Hi,
    We are working on some reports in OBIEE and we are unable to optimize performance. Please consider the following scenarios:
    1) Detail Report
    In this particular report we essentially have a data dump. That is, there are 5 dimensions and a fact table, and we are just selecting
    fields from the various dimensions and facts. But the problem is there are over 5000 records. I guess that is making the report very slow.
    Another issue is that when we click on the page controls, they simply don't work; that is, if we click on (>>) to see all records it takes a lot
    of time but shows only the first 25 records. Any idea what the problem is?
    2) Calculated Report
    We have one calculated report with 7 calculated columns. Every column contains different measures and we have to
    use lots of filter expressions,
    e.g.: Filter (Measure1 USING {@VarDate} BETWEEN DIM1.FromDate and Dim1.Todate AND {@VarDate} BETWEEN DIM2.FromDate and Dim2.Todate AND <SOME OTHER FILTERS> )
    The granularity of the data is at employee level and there are around 1,388,726 records in the fact table.
    Is there any way to optimize the above, other than creating a summary table?
    Kindly guide us on the above scenarios.

    Hi,
    Thanks for your reply. I was trying the things suggested, and I found that the actual physical SQL (which I can get from the "Manage Session" link) runs in about 4-5 seconds in Query Analyzer, but when I studied the SQL carefully it does not contain some filters which I have applied in the report.
    For example, I have a filter in the report <Measure.Present = 1 >, but in the physical SQL I cannot see that filter. However, the results it gives on the dashboard are correct, that is, after it applies the filter.
    Any idea why this is happening?
    My guess is that because the above filter is not being applied at the physical SQL level, the query fetches something like 20000 records into the server cache and then applies <Measure.Present = 1 >. That might be why it is taking time?
    Please give me your suggestions.
    Thanks!

  • Data usage and battery drain has skyrocketed with iOS 8.1.2 (maybe earlier)

    3 weeks ago my wife got a text message on her iPhone 5 from AT&T that she had used 90% of her data for the month. 30 seconds later, she got another one saying that she had used 100%. Granted, her data plan is small, 200 MB. But for the last two years she has used as little as 17 MB in a month, and her largest month prior to this latest one was 49 MB. This month she got to 380 MB before we were able to shut off her cell data. Otherwise, a second chunk of 200 MB and another $15 would have been added to the month.
    We had noticed that her battery had been draining rather rapidly in standby mode with no background apps running when she didn't have access to WiFi. We have now correlated the symptoms: high cell data usage, quick battery drain, and a very warm phone. In case anyone is wondering about her iPhone 5 being one in the recall, it is. I took it in and had it tested. Apple says the battery is fine. It does hold a charge well. When all network communications are shut down on her phone and it is in standby mode, there is no noticeable loss of charge in 24 hours (our longest test).
    Since I have a grandfathered Unlimited Data Plan with AT&T, I have been doing some testing on my iPhone 5s.  Sure enough, I've seen similar results.  With nearly everything turned off except Cell Data, I've seen 483 MB used in 24 hours.  Of that total, 479 MB was used for Documents & Sync.  I even have everything turned off in my iCloud account, Location services are turned off, no push notifications etc...
    The best I can figure, this has been a problem since we installed 8.1.2; yesterday I installed 8.1.3, with no positive effect on this issue.
    At this rate of data usage with no access to WiFi, my wife would have to have a 15 GB data plan to keep from exceeding her limit. I don't think AT&T even offers one that big anymore. I might have to switch to Sprint or T-Mobile if this doesn't get resolved.
    Any ideas would be greatly appreciated.  Thanks in advance for your shared insights.

    Hey, this is Brian, bjmcmurry, under a different user ID. When I logged in as bmcmurry (the original poster), I just got a server error and couldn't get to this discussion or post a comment anywhere.
    Anyway, let me be as brief as I can. After an extensive analysis, replicating the problem on another phone (one with an unlimited data plan) to rule out a hardware-specific issue, I was able to stop the excessive data usage and the consequent battery drain. I believe I have a reasonable explanation of the problem: an iOS logic bug.
    I had everything under my control clamped down on the test phone to minimize data usage.  Also, I was forcing all data usage to use the cell towers.  I determined that I was burning at least 11 to 13 Mb per hour doing nothing important.  I literally had everything turned off except 'Settings' was allowed to use Cell data providing a avenue to log into the Apple ID. 
    The two phones had 2 things in common, they were using the same Apple ID and the same version of the OS.  One was a iPhone 5 and the other was a 5s.
    Here is a synopsis of my experience via my written evaluation of the Apple support system (Apple rep names have been removed):
    After a couple of hours on the phone with an AppleCare senior support analyst, it was suggested that I go to the Apple Store to 1) get the battery replaced in the iPhone 5, since it qualified for the replacement program, and 2) see if the Genius could provide me with any insight into the data usage issue and battery drain.
    I'm certainly glad I had multiple reasons to visit the closest Apple Store, a 45-minute drive.  I got the battery replaced, thanks for that.
    Regarding the issue that actually brought me to the Genius Bar, excessive cell data usage, I thought the Genius did a wonderful job listening and documenting the problem we were experiencing.  Unfortunately, educating us on the use of the product or replacing our phone was not going to solve the problem.  He eventually figured it out: this was not user error or broken hardware.  We surmised that the reason I was sent to him was to verify and document that there was a problem.  I clearly had a problem.  After writing copious notes on the issue, the Genius smartly referred me back to the AppleCare senior advisor.  So, back to the AppleCare senior analyst I went.  She, knowing this was not a simple problem to solve (i.e., it was beyond her level of expertise), got an Apple software engineer involved.
    My primary frustration is that this should have been escalated to software engineering right away.  My trip to see the Genius was, I believe, a waste of time for both me and Apple regarding this issue.  Diagnosis of this type of problem is well beyond the Genius’ knowledge and job.  The Genius did a very good job doing what he could within the scope of his job.  He just couldn’t solve this problem and had no one within Apple to which he had access for help.
    However, that initial consult with the engineer was a bit feeble.  Not only could I not talk to him/her directly, his/her focus was still on the 'black box' approach.  Having the analyst ask me to do things like restore the phone was more like trying to find a workaround than identifying the real problem.  Maybe restoring the phone was going to fix some corrupted data on the phone, sidestepping the real problem, the handling of this condition.  The engineer did listen to the fact that I had restored the test phone, the 5s, and only added the use of my Apple ID.  Duh, been there!!!
    As the previous owner of a software development company that I started in 1980 and retired from in 2001, I can tell you that using the 'black box' method of debugging is extremely inefficient.  Instead, having access to an 'Activity Manager'-type app to identify in real time which process was blowing up the data flow would have saved me a lost afternoon at the Apple Store and hours on the phone with tech support.
    Ironically, I may have stumbled onto a workaround, short of turning off cell data, that stopped the excessive cell data usage and the consequent battery drain.  Just prior to leaving the Apple Store, I used one of their Macs to log in with my Apple ID on the id.apple.com website to 'Manage my Apple ID', and I was forced to upgrade the security level of my password.  Afterward, my cell data usage seemed to drop dramatically.
    I told the AppleCare Senior Analyst that I may have stumbled onto a workaround and I suggested that I give it a couple of days of monitoring.  We'd talk then and I would have results to report.  She agreed. 
    Sure enough, my Documents & Sync usage was back to normal, about 1/2 MB per day.  Am I happy that the excessive cell data usage has stopped?  You bet!  But there was no intuitive reason an end user would know to log in on the ID website to unknowingly solve this problem.  As I used to say to my employees, that is not the 'Macintosh Way.'  And rebooting is the 'Microsoft Way.'
    When I reported my findings to the analyst, we both concluded that data usage was back to normal.  She did report to me that the engineer had provided a laundry list of suggested activities to help solve the problem.  Remarkably, they were everything that I had already done and were pretty much what the analyst and the Genius tried to do.  All reasonable tasks.  But clearly, the engineer hadn't even read the notes on the case before responding.  And none of the things suggested would have uncovered the real problem, likely a logic bug in iOS.
    My advice now is to have the software engineering team rule out that there was an unresolvable state that a background process was fiercely trying to handle.  The iOS does not seem to know how to intervene and prompt the user with a requirement to upgrade their password.  If this does end up being the case, I would call it a logic bug, a perpetual exception that is not being handled properly.
    Anyway, I hope this helps you.

  • Download using open data set and close data set

    Can anybody please send some sample program using OPEN DATASET and CLOSE DATASET? The data should get downloaded to the application server.
    A very simple program is needed.

    Hi Arun,
    See the Sample code for BDC using OPEN DATASET.
    report ZSDBDCP_PRICING no standard page heading
    line-size 255.
    include zbdcrecx1.
    *--Internal Table To hold condition records data from flat file.
    Data: begin of it_pricing occurs 0,
    key(4),
    f1(4),
    f2(4),
    f3(2),
    f4(18),
    f5(16),
    end of it_pricing.
    *--Internal Table To hold condition records header .
    data : begin of it_header occurs 0,
    key(4),
    f1(4),
    f2(4),
    f3(2),
    end of it_header.
    *--Internal Table To hold condition records details .
    data : begin of it_details occurs 0,
    key(4),
    f4(18),
    f5(16),
    end of it_details.
    data : v_sno(2),
    v_rows type i,
    v_fname(40),
    v_str(255). "work area for one line read from the file
    *--Application server file name (used below, but its declaration was missing from the original post)
    parameters : p_fname(128) lower case.
    start-of-selection.
    refresh : it_pricing,it_header,it_details.
    clear : it_pricing,it_header,it_details.
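    *--Load the flat file from the presentation server into it_pricing (FM 'UPLOAD' is obsolete; GUI_UPLOAD is its replacement)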
    CALL FUNCTION 'UPLOAD'
    EXPORTING
    FILENAME = 'C:\WINDOWS\Desktop\pricing.txt'
    FILETYPE = 'DAT'
    TABLES
    DATA_TAB = it_pricing
    EXCEPTIONS
    CONVERSION_ERROR = 1
    INVALID_TABLE_WIDTH = 2
    INVALID_TYPE = 3
    NO_BATCH = 4
    UNKNOWN_ERROR = 5
    GUI_REFUSE_FILETRANSFER = 6
    OTHERS = 7.
    WRITE : / 'Condition Records ', P_FNAME, ' on ', SY-DATUM.
    OPEN DATASET P_FNAME FOR INPUT IN TEXT MODE ENCODING DEFAULT. "ENCODING DEFAULT is required on Unicode systems
    if sy-subrc ne 0.
    write : / 'File could not be uploaded.. Check file name.'.
    stop.
    endif.
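    *--Read the application server file line by line; this refills it_pricing and replaces the data loaded by 'UPLOAD' above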
    CLEAR : it_pricing[], it_pricing.
    DO.
    READ DATASET P_FNAME INTO V_STR.
    IF SY-SUBRC NE 0.
    EXIT.
    ENDIF.
    write v_str.
    translate v_str using '#/'.
    SPLIT V_STR AT ',' INTO it_pricing-key
    it_pricing-F1 it_pricing-F2 it_pricing-F3
    it_pricing-F4 it_pricing-F5 .
    APPEND it_pricing.
    CLEAR it_pricing.
    ENDDO.
    CLOSE DATASET P_FNAME. "close the file once all lines have been read
    IF it_pricing[] IS INITIAL.
    WRITE : / 'No data found to upload'.
    STOP.
    ENDIF.
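    *--Split the uploaded records: one it_header row per condition key, all rows into it_details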
    loop at it_pricing.
    At new key.
    read table it_pricing index sy-tabix.
    move-corresponding it_pricing to it_header.
    append it_header.
    clear it_header.
    endat.
    move-corresponding it_pricing to it_details.
    append it_details.
    clear it_details.
    endloop.
    perform open_group.
    v_rows = sy-srows - 8.
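    *--Build and post one BDC transaction (VK11) per condition header; item lines go into the table control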
    loop at it_header.
    perform bdc_dynpro using 'SAPMV13A' '0100'.
    perform bdc_field using 'BDC_CURSOR'
    'RV13A-KSCHL'.
    perform bdc_field using 'BDC_OKCODE'
    '/00'.
    perform bdc_field using 'RV13A-KSCHL'
    it_header-f1.
    perform bdc_dynpro using 'SAPMV13A' '1004'.
    perform bdc_field using 'BDC_CURSOR'
    'KONP-KBETR(01)'.
    perform bdc_field using 'BDC_OKCODE'
    '/00'.
    perform bdc_field using 'KOMG-VKORG'
    it_header-f2.
    perform bdc_field using 'KOMG-VTWEG'
    it_header-f3.
    **Table Control
    v_sno = 0.
    loop at it_details where key eq it_header-key.
    v_sno = v_sno + 1.
    clear v_fname.
    CONCATENATE 'KOMG-MATNR(' V_SNO ')' INTO V_FNAME.
    perform bdc_field using v_fname
    it_details-f4.
    clear v_fname.
    CONCATENATE 'KONP-KBETR(' V_SNO ')' INTO V_FNAME.
    perform bdc_field using v_fname
    it_details-f5.
    if v_sno eq v_rows.
    v_sno = 0.
    perform bdc_dynpro using 'SAPMV13A' '1004'.
    perform bdc_field using 'BDC_OKCODE'
    '=P+'.
    perform bdc_dynpro using 'SAPMV13A' '1004'.
    perform bdc_field using 'BDC_OKCODE'
    '/00'.
    endif.
    endloop.
    *--Save
    perform bdc_dynpro using 'SAPMV13A' '1004'.
    perform bdc_field using 'BDC_OKCODE'
    '=SICH'.
    perform bdc_transaction using 'VK11'.
    endloop.
    perform close_group.
    Hope this resolves your query.
    Reward all the helpful answers.
    Regards
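    Note that the sample above reads a file from the application server (the upload direction). Since the question asks for a download to the application server, here is a minimal sketch going the other way with OPEN DATASET ... FOR OUTPUT, TRANSFER and CLOSE DATASET. The file path '/tmp/pricing_out.txt', the structure of it_data and the sample values are only illustrative assumptions, not from the original post.
    report zdownload_appserver.
    *--Example structure; replace with your own fields
    data : begin of it_data occurs 0,
    matnr(18),
    kbetr(16),
    end of it_data.
    data : v_line(255).
    *--Example file path on the application server
    parameters : p_file(128) lower case default '/tmp/pricing_out.txt'.
    start-of-selection.
    *--Fill it_data here, e.g. from a SELECT; a dummy record is appended for illustration
    it_data-matnr = 'MAT001'. it_data-kbetr = '100.00'. append it_data.
    *--Open the application server file for writing
    open dataset p_file for output in text mode encoding default.
    if sy-subrc ne 0.
    write : / 'Could not open file ', p_file.
    stop.
    endif.
    *--Write one comma-separated line per record
    loop at it_data.
    concatenate it_data-matnr it_data-kbetr into v_line separated by ','.
    transfer v_line to p_file.
    endloop.
    *--Always close the dataset when done
    close dataset p_file.
    write : / 'Data downloaded to application server file ', p_file.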

  • Hi, my MacBook Pro cannot open Safari. It quit unexpectedly while using the librooksbas.dylib plug-in; the report details the exception code as 'EXC_BAD_ACCESS (SIGSEGV)' and exception codes as 'KERN_INVALID_ADDRESS at 0x0000000920

    Hi, my MacBook Pro cannot open Safari. It crashes and gives the message, 'Safari quit unexpectedly while using the librooksbas.dylib plug-in'. The report details the exception code as 'EXC_BAD_ACCESS (SIGSEGV)' and exception codes as 'KERN_INVALID_ADDRESS at 0x0000000920

    Remove "Rapport" by following the instructions on this page.
    Back up all data before making any changes.

Maybe you are looking for

  • Get lastest row for duplicates

    I have an Excel spreadsheet that has multiple rows for a specific customer.  I want to only keep the row for each customer that has the earliest shipping date but I want all the columns.  Can I do that with PowerPivot and if so, how?

  • BAPI_DOCUMENT_CREATE2 / BAPI_DOCUMENT_CHANGE2 - Create new version of a doc

    How can I create a new version of an existing document using one of these two BAPIs.  If it can't be done using one of these two (I tried several ways and don't think it is) then how can I accomplish this? Regards, Davis

  • Getting the validity period of a Document and send notification

    Dear Experts I have enabled Timebased publishing for some folders. Now I want to send a notification to the Administrator prior to expiry of a document. Can I achieve using a scheduler Task? If yes, if choose the CM system on which i want the schedul

  • Mail not sending using .mac accounts

    For years, I've had multiple .mac accounts. I've only had one listed as my official .mac account while the others I sent up in Mail as IMAP accounts. This has worked wonderfully this entire time. BUT NOW, none of my .mac accounts are sending! I've tr

  • Error using SQL data in javascript vars

    I have a javascript routine which queries sqlite for a series of top-level categories, then, for each category, executes another query, using an item ID which is a result of the first query as follows: // Get first level of categories table var stmt