Dbx uses 6GB (and counting) loading core file

I'm trying to load a core file in dbx, but it just keeps using memory (I killed it after 6GB resident size). It sits at the prompt
Reading conv
where conv is the executable.
The core file is 4.5MB; the executable is 680kB. The program was compiled with gcc 3.4 as a 64-bit binary with debugging symbols - should dbx be able to load that? I've tried gdb from the sun.com Solaris Freeware page, but that seems unable to handle 64-bit programs.
We're running on Solaris 10, using dbx 7.5 2005/10/13 from Sun Studio 11.

Some questions:
- What happens if you just debug the executable without a core file?
- Is the 680kB executable size a "text" size or an "ls" size?
- Is your gcc producing stabs or DWARF? (One quick way to check is sketched below.)
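
Not part of the original thread, but a quick way to answer the stabs-vs-DWARF question is to look for the corresponding ELF section names in the binary (on Solaris, something like elfdump -c conv lists them directly). The crude heuristic below just scans the raw file for those section-name strings; it is a sketch, not a proper ELF parser, and the class name is made up for illustration.

import java.nio.file.Files;
import java.nio.file.Paths;

// Crude heuristic: DWARF binaries carry section names such as ".debug_info",
// while stabs binaries carry ".stab"/".stabstr".  Scanning the raw bytes for
// those strings normally hits the section-name string table.
public class DebugFormatCheck {
    public static void main(String[] args) throws Exception {
        byte[] data = Files.readAllBytes(Paths.get(args[0]));
        System.out.println("DWARF (.debug_info): " + contains(data, ".debug_info"));
        System.out.println("stabs (.stab):       " + contains(data, ".stab"));
    }

    private static boolean contains(byte[] haystack, String s) {
        byte[] needle = s.getBytes(java.nio.charset.StandardCharsets.US_ASCII);
        outer:
        for (int i = 0; i + needle.length <= haystack.length; i++) {
            for (int j = 0; j < needle.length; j++) {
                if (haystack[i + j] != needle[j]) continue outer;
            }
            return true;
        }
        return false;
    }
}

If it turns out to be stabs, gcc 3.4 can be asked for DWARF explicitly with -gdwarf-2 to see whether dbx behaves differently.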

Similar Messages

  • Selected file is not executable when loading core file

    Hello all,
    I'm using Sun Studio 12 on sun4u SPARC, compiler version Sun C++ 5.9 SunOS_sparc Patch 124863-04 2008/04/16.
    When I try to load a core dump created from one of the executables with Sun Studio 12, I get the message
    "The selected file is not executable"
    but when I open the core file with command-line dbx, everything works fine.

    Here is the full output:
    Running "/opt/SUNWspro12/SUNWspro/bin/dmake  -f Makefile CONF=Debug" in /home/meiry/SunStudioProjects/Quote_1
    dmake: defaulting to parallel mode.
    See the man page dmake(1) for more information on setting up the .dmakerc file.
    sun8 --> 1 job
    /opt/SUNWspro12/SUNWspro/bin/dmake -f nbproject/Makefile-Debug.mk SUBPROJECTS= .build-conf
    sun8 --> 1 job
    mkdir -p build/Debug/Sun9-Solaris-Sparc
    CC    -c -g +w -o build/Debug/Sun9-Solaris-Sparc/disk.o disk.cc
    sun8 --> 2 jobs
    mkdir -p build/Debug/Sun9-Solaris-Sparc
    CC    -c -g +w -o build/Debug/Sun9-Solaris-Sparc/cpu.o cpu.cc
    sun8 --> Job output
    mkdir -p build/Debug/Sun9-Solaris-Sparc
    CC    -c -g +w -o build/Debug/Sun9-Solaris-Sparc/cpu.o cpu.cc
    (/home/meiry/SunStudioProjects/Quote_1)cpu.cc:
    "/opt/SUNWspro/prod/include/CC/Cstd/iostream.h", line 4: Error: istream is not a member of std.
    "/opt/SUNWspro/prod/include/CC/Cstd/iostream.h", line 5: Error: cin is not a member of std.
    "/opt/SUNWspro/prod/include/CC/Cstd/iostream.h", line 6: Error: ws is not a member of std.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 64: Error: Type name expected instead of "locale".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: While specializing "std::basic_streambuf<std::charT, std::traits>".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: Specialized in non-template code.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 65: Error: Type name expected instead of "locale".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: While specializing "std::basic_streambuf<std::charT, std::traits>".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: Specialized in non-template code.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 68: Error: streamsize is not defined.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: While specializing "std::basic_streambuf<std::charT, std::traits>".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: Specialized in non-template code.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 145: Error: Type name expected instead of "streamsize".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: While specializing "std::basic_streambuf<std::charT, std::traits>".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: Specialized in non-template code.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 146: Error: Type name expected instead of "streamsize".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: While specializing "std::basic_streambuf<std::charT, std::traits>".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: Specialized in non-template code.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 152: Error: Type name expected instead of "streamsize".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: While specializing "std::basic_streambuf<std::charT, std::traits>".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: Specialized in non-template code.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 167: Error: Type name expected instead of "locale".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: While specializing "std::basic_streambuf<std::charT, std::traits>".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: Specialized in non-template code.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 170: Error: Type name expected instead of "_RWSTDGuard".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: While specializing "std::basic_streambuf<std::charT, std::traits>".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: Specialized in non-template code.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 236: Error: Templates can only declare classes or functions.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 239: Error: Use ";" to terminate statements.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 239: Error: A declaration was expected instead of "return".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 239: Error: s is not defined.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 248: Error: basic_streambuf is not defined.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 248: Error: int_type is not defined.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 249: Error: Templates can only declare classes or functions.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 251: Error: Use ";" to terminate statements.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 251: Error: A declaration was expected instead of "if".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 251: Error: ")" expected instead of ">".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 251: Error: Unexpected ")" -- Check for matching parenthesis.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 251: Error: Operand expected instead of ")".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 253: Error: The function "gbump" must have a prototype.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 254: Error: traits is not defined.
    Compilation aborted, too many Error messages.
    :(/home/meiry/SunStudioProjects/Quote_1)cpu.cc
    *** Error code 1
    dmake: Fatal error: Command failed for target `build/Debug/Sun9-Solaris-Sparc/cpu.o'
    Current working directory /home/meiry/SunStudioProjects/Quote_1
    Waiting for 1 job to finish
    sun8 --> Job output
    mkdir -p build/Debug/Sun9-Solaris-Sparc
    CC    -c -g +w -o build/Debug/Sun9-Solaris-Sparc/disk.o disk.cc
    (/home/meiry/SunStudioProjects/Quote_1)disk.cc:
    "/opt/SUNWspro/prod/include/CC/Cstd/iostream.h", line 4: Error: istream is not a member of std.
    "/opt/SUNWspro/prod/include/CC/Cstd/iostream.h", line 5: Error: cin is not a member of std.
    "/opt/SUNWspro/prod/include/CC/Cstd/iostream.h", line 6: Error: ws is not a member of std.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 64: Error: Type name expected instead of "locale".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: While specializing "std::basic_streambuf<std::charT, std::traits>".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: Specialized in non-template code.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 65: Error: Type name expected instead of "locale".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: While specializing "std::basic_streambuf<std::charT, std::traits>".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: Specialized in non-template code.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 68: Error: streamsize is not defined.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: While specializing "std::basic_streambuf<std::charT, std::traits>".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: Specialized in non-template code.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 145: Error: Type name expected instead of "streamsize".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: While specializing "std::basic_streambuf<std::charT, std::traits>".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: Specialized in non-template code.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 146: Error: Type name expected instead of "streamsize".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: While specializing "std::basic_streambuf<std::charT, std::traits>".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: Specialized in non-template code.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 152: Error: Type name expected instead of "streamsize".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: While specializing "std::basic_streambuf<std::charT, std::traits>".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: Specialized in non-template code.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 167: Error: Type name expected instead of "locale".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: While specializing "std::basic_streambuf<std::charT, std::traits>".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: Specialized in non-template code.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 170: Error: Type name expected instead of "_RWSTDGuard".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: While specializing "std::basic_streambuf<std::charT, std::traits>".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 185:     Where: Specialized in non-template code.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 236: Error: Templates can only declare classes or functions.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 239: Error: Use ";" to terminate statements.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 239: Error: A declaration was expected instead of "return".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 239: Error: s is not defined.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 248: Error: basic_streambuf is not defined.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 248: Error: int_type is not defined.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 249: Error: Templates can only declare classes or functions.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 251: Error: Use ";" to terminate statements.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 251: Error: A declaration was expected instead of "if".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 251: Error: ")" expected instead of ">".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 251: Error: Unexpected ")" -- Check for matching parenthesis.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 251: Error: Operand expected instead of ")".
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 253: Error: The function "gbump" must have a prototype.
    "/opt/SUNWspro/prod/include/CC/Cstd/./streambuf", line 254: Error: traits is not defined.
    Compilation aborted, too many Error messages.
    :(/home/meiry/SunStudioProjects/Quote_1)disk.cc
    *** Error code 1
    dmake: Warning: Command failed for target `build/Debug/Sun9-Solaris-Sparc/disk.o'
    Current working directory /home/meiry/SunStudioProjects/Quote_1
    *** Error code 1
    dmake: Fatal error: Command failed for target `.build-impl'
    Build failed. Exit value 1.

  • Sender System and Counter in Receiver file

    Hi,
    I have a requirement to include a 'Sender System ID' and a 'Counter' in the receiver flat file.
    Please suggest the best way to achieve this.
    Scenario: ABAP --> PI --> Legacy System (proxy-to-file scenario)
    Regards,
    Amol

    Hi,
    Please refer to the following thread:
    System ID in the user defined function
    /message/686478#686478 [original link is broken]
    As mentioned in the thread, use System.getProperty("SAPSYSTEMNAME"); in a UDF to get the SYSID (a minimal sketch follows).
    Counter? Do you mean a record counter or something like that?
    You can use the standard count (statistic) or counter (arithmetic) functions for that. Please check the documentation for more help.
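    The reply above only names the call; as a rough illustration (not an actual PI user-defined function, whose wrapper signature and Container parameter depend on the PI release), the core logic might look like this in Java. The class and method names here are made up:

    // Hypothetical helper, not a drop-in UDF: the real UDF wrapper (Container
    // parameter, trace object, package imports) depends on your PI release.
    // The reply's point is simply the system property read below.
    public class SysIdHelper {

        // Returns the SAP system ID of the host the mapping runs on.
        public static String getSysId() {
            String sid = System.getProperty("SAPSYSTEMNAME");
            return sid != null ? sid : "UNKNOWN";
        }

        // One possible reading of "Counter": a simple running record number
        // to write into the receiver file.
        private static int counter = 0;

        public static synchronized String nextCounter() {
            return String.valueOf(++counter);
        }
    }

    For a plain record counter, the standard graphical counter function mentioned above is usually simpler than a UDF.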

  • When using Vikas' program to load large files, getting an error

    Hello,
    I am using Vikas' program to load large data files: http://htmldb.oracle.com/pls/otn/f?p=38131:1
    This works fine, except that when I click on the button to create the table, I get a "not found" error:
    failed to parse SQL query:
    ORA-00942: table or view does not exist
    What might cause this? I've checked grants and such and reviewed the code, but haven't figured it out...
    Thanks!

  • Using Applescript and Automator to manage files and folders

    Hi all.
    I need to make a simple action, via AppleScript and/or Automator, to take the file it's been applied to (via the Services menu) and create a folder around it with the same name as the file.
    So far I've searched the web and found some solutions, but none are really working.
    Here's the script I found:
    set myFolder to findFolder()
    tell application "Finder" to set myFiles to files of myFolder as alias list
    repeat with aFile in myFiles
      set bName to my baseName(aFile)
      tell application "Finder"
      set folderExists to exists folder bName of myFolder
      if not folderExists then make new folder at myFolder with properties {name:bName}
      move aFile to folder bName of myFolder
      end tell
    end repeat
    ---------------- HANDLERS ----------------
    on baseName(myFile)
      tell application "System Events" to set {fileName, fileExt} to {name, name extension} of myFile
      return text 1 thru ((get offset of "." & fileExt in fileName) - 1) of fileName
    end baseName
    on findFolder()
      activate application "SystemUIServer"
      -- Bug pointed out by Lauri Ranta http://www.openradar.me/9406282
      tell application "Finder"
      activate
      set mySelection to (get selection)
      if mySelection ≠ {} then
      set mySelection to first item of (get selection)
      if mySelection's class = folder then
      set currentFolder to mySelection
      else if mySelection's class = document file then
      set currentFolder to parent of mySelection
      else if mySelection's class = alias file then
      set currentFolder to original item of mySelection
      end if
      else
      set currentFolder to target of front Finder window
      end if
      end tell
      return (currentFolder as alias)
    end findFolder
    And here's a page where they explain how to use Automator and the Services to do almost what I want, with some adaptation.
    But it still doesn't work.
    http://hbase.net/2011/08/17/move-selected-files-into-a-new-folder-using-applescript-and-automator/
    So does anybody have an idea how I could do this?
    It could either be a folder action (I drag and drop the files into a folder and boom, they get their folder around them) that I would set on a folder on a network drive, or locally, it doesn't matter, or a service (right-click on the file and boom, folder around it).
    So if anyone could help with this I'd be grateful...

    Hi,
    Make an Automator Service (Service receives selected "Files or Folders"  in the "Finder.app").
    Use this script in the "Run AppleScript" action:
    on run {input, parameters}
        tell application "Finder"
            repeat with aFile in input
                if class of (item aFile) is not folder then
                    set {tName, fileExt} to {name, name extension} of aFile
                    if fileExt is not missing value and fileExt is not "" then
                        set tName to text 1 thru -((count fileExt) + 2) of tName
                        tell (get container of aFile)
                            if not (exists folder tName) then make new folder at it with properties {name:tName}
                            move aFile to folder tName
                        end tell
                    end if
                end if
            end repeat
        end tell
    end run
    This script works on files that have a name extension.

  • DRM Error: 3305[ServerConnectionFailed]  and Error loading metadata file:Error #2048

    I am using the sample player in the Adobe Access 4_0 SDK to play encrypted video on FMS 5.0.
    1) When trying to play the "http://.../vod/WorldCup.mp4" file, I get the error "DRM Error: 3305[ServerConnectionFailed]  Load http://.../vod/WorldCup.mp4".
    2) When switching to TVP mode and trying to "Load DRM Metadata", I get the log "Error loading metadata file: Get URL:http://.../vod/WorldCup.mp4.metadata:Error #2048".

  • How to use Lazy and Eager loading dynamically

    Hi
    I am using JPA. I have a table which has a one-to-many relationship with another table. I want lazy loading by default and eager loading in a specific case.
    Can we change this fetch type at the time of the query?
    Thanks
    Sateesh

    In EclipseLink 2.1 there is also a LoadPolicy that allows a relationship to be loaded on a query.
    query.setHint("eclipselink.load-group.attribute", "d.employees");
    The difference between it and a fetch join is that the relationship is loaded normally (or as configured), not fetch joined (see the sketch below).
    James : http://www.eclipselink.org
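    A hedged sketch of how that hint is applied per query; the Department entity and its lazy 'employees' relationship are assumed names for illustration, while the hint string is the one quoted above:

    import javax.persistence.EntityManager;
    import javax.persistence.Query;
    import java.util.List;

    // Sketch only: assumes an entity named Department with a lazy one-to-many
    // 'employees' relationship.  The mapping stays LAZY by default; this
    // particular query asks EclipseLink to load the relationship as well.
    public class LoadOnDemandExample {
        @SuppressWarnings("unchecked")
        public static List<Object> departmentsWithEmployees(EntityManager em) {
            Query query = em.createQuery("SELECT d FROM Department d");
            query.setHint("eclipselink.load-group.attribute", "d.employees");
            return query.getResultList();
        }
    }

    The portable JPA alternative for the specific case is a fetch join ("SELECT DISTINCT d FROM Department d JOIN FETCH d.employees"), which, as noted above, joins the rows rather than loading the relationship normally.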

  • Using one concurrent program loading 85 files to different tables

    I am looking to see if there is a way I can have one concurrent program and one control file to load all 85 files into 85 different tables at once. I really am hoping I don't have to write 85 different control files and 85 different concurrent programs. Can someone point me in the right direction or share examples? Thanks.

    Hi,
    it is difficult to imagine having the same software do 85 different things without writing 85 different procedures :)
    Without defining all the details, I would do it this way:
    1) Standardize the file names pertinent to the target tables, for both the source and the control file (e.g. table MYTABLE001, source mytable001.dat, control mytable001.ctl).
    2) Define a standard for how the files are produced (e.g. all fields separated by commas, with the fields in the same format and order as the columns of the target table).
    3) Write a shell script that scans the directory, matches the *.dat files against your patterns, and executes a SQL script that spools out a control file generated from the Oracle dictionary information about the table identified by the file. The shell script should then run SQL*Loader (or Data Pump) to load the file, check for a correct result and, if necessary, archive the processed file or do anything else you want, like emailing a supervisor, and so on... (a small sketch of the scan-and-load loop follows).
    You trade the time spent writing 85 control files and shell scripts for a more challenging (and complex) solution...
    Bye Max
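    The reply proposes a shell script; purely to illustrate the same scan-and-load loop (assuming the mytableNNN.dat / mytableNNN.ctl naming convention from point 1, and with placeholder credentials), a Java sketch might look like this:

    import java.io.File;

    // Sketch of the directory scan described above: one SQL*Loader run per
    // *.dat file, pairing mytable001.dat with mytable001.ctl.  Credentials and
    // paths are placeholders; control-file generation from the data dictionary
    // is left out.
    public class BulkSqlLoader {
        public static void main(String[] args) throws Exception {
            File dir = new File(args.length > 0 ? args[0] : ".");
            File[] datFiles = dir.listFiles((d, name) -> name.endsWith(".dat"));
            if (datFiles == null) return;
            for (File dat : datFiles) {
                String base = dat.getName().replaceFirst("\\.dat$", "");
                ProcessBuilder pb = new ProcessBuilder(
                        "sqlldr", "userid=scott/tiger",
                        "control=" + new File(dir, base + ".ctl").getPath(),
                        "data=" + dat.getPath(),
                        "log=" + base + ".log");
                pb.inheritIO();
                int rc = pb.start().waitFor();
                System.out.println(base + " -> sqlldr exit code " + rc);
            }
        }
    }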

  • I export using QuickTime and it loads and acts like it's working, then when it is finished there is no exported file. It's as if nothing happened!

    I have used iMovie in the past for editing gaming videos, but due to a software issue I have had to return to iMovie. The issue is that when I try to export, either just by exporting the movie or by exporting using QuickTime, it acts like it is working, then at the end it finishes and there is no sign it worked at all: no exported file in any folder and no error message. There is also an error when I try to optimize, but I can do without that if need be. Thanks!

  • Using Anchors and Count in Report

    Hello,
    In a report I am using two rows and a count variable. If the data in the first row is null, then the second row must move up when we anchor, but it's not happening. Can anybody help me?
    Thanks & Regards,
    Dinesh .M

    Thanks Azadi. I found the answer for the Spry tabs at
    http://foundationphp.com/tutorials/spry_url_utils.php
    It's pretty involved, and includes downloading the latest JavaScript file for Spry 1.6.1 and some coding, but I tested it and it works.

  • How do I bulk upload documents using PowerShell and extract metadata from file name?

    I have a requirement to upload a bunch of documents into a document library. Based on the content type, the rules for updating the metadata are different... the one giving me trouble is extracting the metadata from the file name. If I have a file name like
    "part1_part2_part3.pdf", how do I extract part1, part2, and part3 and tag each document being uploaded into SharePoint, using PowerShell? I have searched and have not been able to find anything to get me started.
    Has anyone done this before? Or is there a blog I can take a look at? Thanks
     

    You will have to write a PowerShell script encompassing this logic:
    Read the files from the folder using the Get-Item cmdlet.
    Determine the content type based on the path or file name.
    Split the file name to extract the tag names (illustrated below).
    If the metadata field in the content type is a managed metadata field, check whether the term exists and set it.
    Updating SharePoint Managed Metadata Columns with PowerShell
    This post is my own opinion and does not necessarily reflect the opinion or view of Slalom.
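    The upload and tagging themselves would be done in PowerShell as described above; only the file-name-splitting step is sketched here (in Java, to match the other sketches on this page), using the example name from the question:

    import java.util.Arrays;

    // Splits "part1_part2_part3.pdf" into ["part1", "part2", "part3"]:
    // drop the extension, then split the base name on underscores.
    public class FileNameTags {
        public static String[] tagsFrom(String fileName) {
            int dot = fileName.lastIndexOf('.');
            String base = dot > 0 ? fileName.substring(0, dot) : fileName;
            return base.split("_");
        }

        public static void main(String[] args) {
            System.out.println(Arrays.toString(tagsFrom("part1_part2_part3.pdf")));
        }
    }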

  • BW - Using up memory when loading full files

    Each night we have a number of jobs which load our data into BW. Most of the loads are full data loads, and therefore the system should wipe out the data that was there previously. However, it seems to be keeping the information in the database - not duplicating the information, but using up the memory. I think we may have had a program which used to delete this. Does anyone have any idea what that would be?
    We are on the old version 2.1C.

    Wendy,
    It is more of a maintenance task. A couple of things you can do:
    1. If you load data to targets through the PSA, you can delete old data from the PSA after the load, say data that is 'xxx' days old. This will improve performance.
    2. When loading a full load via a single InfoPackage to multiple targets at the same time, the load will create memory bottlenecks. You can avoid this by using a staging area: load into a DSO and from there load into the further targets using individual InfoPackages. Though the same amount of data passes through memory, it is handled better and performs better.
    3. Where possible, try to use 'Delta' if your extractor supports it.
    Hope this helps; award points if useful.
    Alex (Arthur Samson)

  • Using HsGetValue and HsSetValue in same file

    Is it possible to have one worksheet use HsGetValue formulas to retrieve data from Essbase, a second worksheet used for adjusting data, a third worksheet that adds the first two, and then a final worksheet with HsSetValue referencing the third worksheet? For some reason, when I refresh (not Refresh All) the first worksheet, it submits data from the HsSetValue worksheet first and then retrieves the data, instead of retrieving the data for the POV on the worksheet and then allowing the adjustments to be submitted back using refresh on the HsSetValue worksheet. Can these two not be independent on separate worksheets?
    Thanks,
    Kenny

    Hi,
    Use the BDC session method.
    Check the example below; it will help you.
    If you want to record two transactions, you use the same method as for one transaction. For example, if you want to execute two transactions such as VA01 (sales order) and VL01N (delivery), first record both transactions separately, and in the function module calls you call BDC_INSERT two times:
    the first call has tcode VA01 and a BDCDATA table with the data for the VA01 transaction, and the second has tcode VL01N and a BDCDATA table containing the recorded data for the delivery.
    PERFORM open_batch_session USING p_sessa.
      LOOP AT ltcap_int
        WHERE NOT tot_inc IS initial.
        ltcap_int-adj = ltcap_int-adj * -1.
        CONCATENATE ltcap_int-fund '/' ltcap_int-cfc INTO zuonr.
        PERFORM fill_bdc_header.
        PERFORM fill_bdc_lines.
        PERFORM post_entries.
        PERFORM insert_batch_session USING 'FB01'.
        ltcap_int-adj = ltcap_int-adj * -1.
      ENDLOOP.
      PERFORM close_batch_session.
    perform start_batch_session using p_sessa.
    " Budgets
      WRITE p_fy TO c_fy.
      PERFORM open_batch_session USING p_sessb.
      LOOP AT ltcap_int
        WHERE NOT tot_inc IS initial.
        PERFORM rollup_header.
        PERFORM rollup_line.
        PERFORM rollup_save.
        PERFORM insert_batch_session USING 'FR21'.
        IF NOT ltcap_int-gsef_amt IS INITIAL.
          PERFORM rollup_header_gsef.
          PERFORM rollup_line_gsef.
          PERFORM rollup_save.
          PERFORM insert_batch_session USING 'FR21'.
        ENDIF.
      ENDLOOP.
      PERFORM close_batch_session.
    perform start_batch_session using p_sessb.
    We can process more than one transaction in the session method.
    To do this, we create internal tables corresponding to the transactions, and
    between BDC_OPEN_GROUP and BDC_CLOSE_GROUP we call BDC_INSERT once per transaction, populating the internal tables that contain the data for the different transactions.

  • Using DISTINCT and COUNT together.

    So I have a query which I discovered is not quite working as planned.
    SELECT DISTINCT NULL LINK,
           TO_CHAR(MONTHS_,'YYYY-MM') DATIME,
           sum(decode(CMS.CMS_NODE_OS.OS_TYPE,'Linux','1',0))"Linux" FROM
      ( SELECT ADD_MONTHS(TRUNC(to_date('28-FEB-2013','DD-MON-YYYY'),'MM'),-ROWNUM + 1) MONTHS_ FROM DUAL CONNECT BY LEVEL <=
      (SELECT CEIL( MONTHS_BETWEEN(to_date('28-FEB-2013','DD-MON-YYYY'),TO_DATE('01-FEB-2013','DD-MON-YYYY'))) FROM DUAL) )
    THE_TIMELINE
      LEFT JOIN CMS.CMS_NODE ON ( MONTHS_ BETWEEN CREATE_DT AND LAST_DAY(NVL(RETIRE_DT,MONTHS_)))
      LEFT JOIN CMS.CMS_NODE_OS ON CMS.CMS_NODE.NODE_NAME=CMS.CMS_NODE_OS.NODE_NAME
      WHERE CMS.CMS_NODE.NODE_ENV<>'Alias' and CMS.CMS_NODE_OS.OS_TYPE='Linux' GROUP BY (TO_CHAR(MONTHS_,'YYYY-MM')) ORDER BY DATIME
    The issue I'm having is that the LEFT JOIN to the CMS_NODE_OS table produces multiple joined records; I just want one joined. Because of the GROUP and SUM, the DISTINCT is rendered null and void (the query returns 961 rows when it should be more like 737).
    How can I join just one CMS_NODE_OS record per CMS_NODE record?

    Hi,
    bostonmacosx wrote:
    ... The issue that I'm having is that on the LEFT JOIN of the CMS_NODE_OS table there are multiple join records. I just want to have one joined...
    That sounds like a Chasm Trap.
    How can I just link one CMS_NODE_OS record per CMS_NODE record...
    Which one?
    If you can write an ORDER BY clause that would put that row first, then you can use the analytic ROW_NUMBER function. Instead of joining to CMS_NODE_OS, join to a sub-query that has the relevant columns from CMS_NODE_OS, as well as ROW_NUMBER (let's call that column r_num). Include "AND r_num = 1" in the join condition.
    That's just one way to deal with a chasm trap. Depending on your data and your requirements, there may be better ways.
    I hope this answers your question.
    If not, post a little sample data (CREATE TABLE and INSERT statements, relevant columns only) for all tables involved, and also post the results you want from that data.
    Explain, using specific examples, how you get those results from that data.
    Always say which version of Oracle you're using (e.g., 11.2.0.2.0).
    See the forum FAQ {message:id=9360002}
    You'll get better answers faster if you always supply this information whenever you post a question.

  • I recently purchased a used iMac and cannot load iPhoto and other apps?

    I recently purchased a used iMac. It was wiped clean and has the newest version of Mac OS X installed.
    When I go to the App Store, it shows iPhoto with an Accept prompt next to it, but when I enter my Apple ID to accept, it says I cannot do that because it was previously accepted, obviously by the original owner. It says I could purchase it, but there is no prompt available to purchase it.
    Questions: Do I have to purchase it? How do I go about doing that?
    Thanks!

    If the machine was originally configured for Snow Leopard (10.6.x) or earlier, you should have gotten the original DVDs, one of which had the iLife apps on it. If it came with Lion already pre-installed you can try Lion Recovery (OS X Lion: About), but I doubt it will download the iLife apps for you. In short, you probably have to buy them or get the original DVDs from the seller. Another option is to call AppleCare; for a nominal cost they will replace the original media. You can find their number by using the AppleCare Contact Info link.

    The ESX24 Sampler claims to be a plug-in that functions exactly like a sampler should. Previous to Logic Pro 7, I used a SP-606 sampler that had three different ways a sample could be played: 1. drum - this function played the sample as long as the b