W_YEAR_D Script to configure Year Data in Apps Database : OBIA

Hi Friends,
We want to configure DAY, MONTH, and YEAR data in our Apps database. We already have scripts for W_DAY_D and W_MONTH_D, shown below, and now need the same kind of script for W_YEAR_D. We are planning to bypass Informatica and DAC, so we need to configure these tables in the Apps database to get results similar to OBIA.
W_DAY_D
CREATE TABLE W_DAY_D AS
(SELECT
ROWNUM ROW_WID,
CurrDate AS Day_ID,
to_number(to_char(CurrDate, 'YYYYMMDD')) as DAY_ID_YYYYMMDD,
--1 AS Day_Time_Span,
--CurrDate                AS Day_End_Date,
TO_CHAR(CurrDate,'Day') AS Week_Day_Full,
TO_CHAR(CurrDate,'DY') AS Week_Day_Short,
TO_NUMBER(TRIM(leading '0'
FROM TO_CHAR(CurrDate,'D'))) AS Day_Num_of_Week,
TO_NUMBER(TRIM(leading '0'
FROM TO_CHAR(CurrDate,'DD'))) AS Day_Num_of_Month,
TO_NUMBER(TRIM(leading '0'
FROM TO_CHAR(CurrDate,'DDD'))) AS Day_Num_of_Year,
UPPER(TO_CHAR(CurrDate,'Mon')
|| '-'
|| TO_CHAR(CURRDATE,'YYYY')) AS MONTH_ID,
TO_CHAR(CurrDate,'Mon')
|| '-'
|| TO_CHAR(CURRDATE,'YY') AS MONTH_SHORT_ID,
TO_CHAR(CurrDate,'Mon')
|| ' '
|| TO_CHAR(CurrDate,'YYYY') AS Month_Short_Desc,
RTRIM(TO_CHAR(CurrDate,'Month'))
|| ' '
|| TO_CHAR(CurrDate,'YYYY') AS Month_Long_Desc,
TO_CHAR(CurrDate,'Mon') AS Month_Short,
TO_CHAR(CurrDate,'Month') AS Month_Long,
TO_NUMBER(TRIM(leading '0'
FROM TO_CHAR(CurrDate,'MM'))) AS Month_Num_of_Year,
'Q'
|| UPPER(TO_CHAR(CurrDate,'Q')
|| '-'
|| TO_CHAR(CurrDate,'YYYY')) AS Quarter_ID,
'Q' || UPPER(TO_CHAR(CurrDate,'Q')) AS Quarter_Short,
TO_NUMBER(TO_CHAR(CurrDate,'Q')) AS Quarter_Num_of_Year,
CASE
WHEN TO_NUMBER(TO_CHAR(CurrDate,'MM')) = 1 THEN 1
WHEN TO_NUMBER(TO_CHAR(CurrDate,'MM')) = 2 THEN 2
WHEN TO_NUMBER(TO_CHAR(CurrDate,'MM')) = 3 THEN 3
WHEN TO_NUMBER(TO_CHAR(CurrDate,'MM')) = 4 THEN 1
WHEN TO_NUMBER(TO_CHAR(CurrDate,'MM')) = 5 THEN 2
WHEN TO_NUMBER(TO_CHAR(CurrDate,'MM')) = 6 THEN 3
WHEN TO_NUMBER(TO_CHAR(CurrDate,'MM')) = 7 THEN 1
WHEN TO_NUMBER(TO_CHAR(CurrDate,'MM')) = 8 THEN 2
WHEN TO_NUMBER(TO_CHAR(CurrDate,'MM')) = 9 THEN 3
WHEN TO_NUMBER(TO_CHAR(CurrDate,'MM')) = 10 THEN 1
WHEN TO_NUMBER(TO_CHAR(CurrDate,'MM')) = 11 THEN 2
WHEN TO_NUMBER(TO_CHAR(CurrDate,'MM')) = 12 THEN 3
END AS Quarter_Num_of_Month,
CASE
WHEN TO_NUMBER(TO_CHAR(CurrDate,'Q')) <= 2
THEN 1
ELSE 2
END AS half_num_of_year,
CASE
WHEN TO_NUMBER(TO_CHAR(CurrDate,'Q')) <= 2
THEN 'H'
|| 1
|| '-'
|| TO_CHAR(CurrDate,'YYYY')
ELSE 'H'
|| 2
|| '-'
|| TO_CHAR(CurrDate,'YYYY')
END AS half_of_year_id,
TO_CHAR(CurrDate,'YYYY') AS Year_ID
FROM
(SELECT level n,
-- Calendar starts at the day after this date.
TO_DATE('31/12/2000','DD/MM/YYYY') + NUMTODSINTERVAL(level,'DAY') CurrDate
FROM dual
-- Change for the number of days to be added to the table.
CONNECT BY level <= 9131
)
);
W_MONTH_D
CREATE TABLE W_MONTH_D
AS
SELECT ROWNUM AS ROW_WID,
B.MONTH_ID AS MONTH_ID,
B.MONTH_SHORT_DESC,B.MONTH_LONG_DESC,B.MONTH_SHORT,B.MONTH_LONG,B.MONTH_NUM_OF_YEAR,B.QUARTER_ID,B.QUARTER_SHORT,B.QUARTER_NUM_OF_YEAR,
B.QUARTER_NUM_OF_MONTH,B.HALF_NUM_OF_YEAR,B.HALF_OF_YEAR_ID,B.YEAR_ID
FROM
(SELECT MONTH_ID,MONTH_SHORT_DESC,MONTH_LONG_DESC,MONTH_SHORT,MONTH_LONG,MONTH_NUM_OF_YEAR,QUARTER_ID,QUARTER_SHORT,QUARTER_NUM_OF_YEAR,
QUARTER_NUM_OF_MONTH,HALF_NUM_OF_YEAR,HALF_OF_YEAR_ID,YEAR_ID FROM W_DAY_D
GROUP BY MONTH_ID,MONTH_SHORT_DESC,MONTH_LONG_DESC,MONTH_SHORT,MONTH_LONG,MONTH_NUM_OF_YEAR,QUARTER_ID,QUARTER_SHORT,QUARTER_NUM_OF_YEAR,
QUARTER_NUM_OF_MONTH,HALF_NUM_OF_YEAR,HALF_OF_YEAR_ID,YEAR_ID
ORDER BY YEAR_ID, MONTH_NUM_OF_YEAR) B;
Please help with this as soon as possible; it is an urgent requirement.
Thanks in Advance.
Raghu Nagadasari
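
One way to build W_YEAR_D along the same lines, as a sketch only: derive it from W_DAY_D by grouping on YEAR_ID, just as W_MONTH_D is derived above. The YEAR_START_DT and YEAR_END_DT columns below are illustrative additions and may not match the standard OBIA W_YEAR_D layout; drop or rename them as needed.
W_YEAR_D
CREATE TABLE W_YEAR_D AS
SELECT ROWNUM          AS ROW_WID,
       B.YEAR_ID       AS YEAR_ID,
       B.YEAR_START_DT AS YEAR_START_DT,   -- illustrative: first calendar day in the year
       B.YEAR_END_DT   AS YEAR_END_DT      -- illustrative: last calendar day in the year
FROM
  (SELECT YEAR_ID,
          MIN(DAY_ID) AS YEAR_START_DT,
          MAX(DAY_ID) AS YEAR_END_DT
   FROM W_DAY_D
   GROUP BY YEAR_ID
   ORDER BY YEAR_ID) B;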

messDate.setGregorianChange(theResult.getDate(2)); //I GET ABSOLUTELY NO DATE HERE
This doesn't do what you think it does. Check the API documentation.
http://java.sun.com/j2se/1.4.1/docs/api/java/util/GregorianCalendar.html
Try this instead:
messDate.setTime(theResult.getDate(2));

Similar Messages

  • How do I integrate data and apps migrated from my old MacBook to my newish MacBook Pro?

    Data and applications were migrated successfully from my old MacBook to my newish (three years old) MacBook Pro. However, I can get to the data and apps from only one computer at a time. How do I integrate documents from the old machine to the new? How do I include apps I used before such as Office in the configuration of the new computer?
    I also do not know how to switch back and forth from one device to the other (I have to choose when I boot up), which makes this situation all the more inconvenient.

    I don't know how Firefox installs, but if you just dragged it into the Applications folder, you can drag it to the Trash and empty it.
    If it had an installer, then you need to use the uninstaller provided by the developer.
    What are the other "files that may not be removed"? It doesn't sound like much of a good idea.

  • [Forum FAQ] How to configure a Data Driven Subscription which gets multi-value parameters from one column of a database table?

    Introduction
    In SQL Server Reporting Services, we can define a mapping between the fields that are returned in the query to specific delivery options and to report parameters in a data-driven subscription.
    For a report with a parameter (such as YEAR) that allows multiple values, when creating a data-driven subscription, how can we pass a record like the one below so that the report shows the correct data (data for years 2012, 2013, and 2014)?
    EmailAddress              Parameter         Comment
    [email protected]       2012,2013,2014    NULL
    In this article, I will demonstrate how to configure a data-driven subscription which gets multi-value parameters from one column of a database table.
    Workaround
    Generally, if we pass the “Parameter” column to the report directly in step 5 when creating the data-driven subscription, the value “2012,2013,2014” is treated as a single value: Reporting Services uses “2012,2013,2014” to filter the data. However, there is no record whose YEAR field equals “2012,2013,2014”, so when the subscription executes we get an error
    in the log. (C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\LogFiles)
    Microsoft.ReportingServices.Diagnostics.Utilities.InvalidReportParameterException: Default value or value provided for the report parameter 'Name' is not a valid value.
    This means there is no such value in the parameter's available-values list, so it is an invalid parameter value. If we instead change the parameter records as below:
    EmailAddress              Parameter    Comment
    [email protected]       2012         NULL
    [email protected]       2013         NULL
    [email protected]       2014         NULL
    In this case, Reporting Services generates three reports for one data-driven subscription, each covering only one year, which obviously does not meet the requirement.
    Currently, there is no direct solution to this issue. The workaround is to create two reports: one used by end users to view the data, and another used only to create the data-driven subscription.
    On the report used to create the data-driven subscription, uncheck the “Allow multiple values” option for the parameter, and do not specify available values or default values for it. Then change the filter
    From
    Expression: [ParameterName]
    Operator:   In
    Value:      [@ParameterName]
    To
    Expression: [ParameterName]
    Operator:   In
    Value:      Split(Parameters!ParameterName.Value, ",")
    In this case, we can specify a value like "2012,2013,2014" from the database for the data-driven subscription (a sketch of this setup follows at the end of this post).
    Applies to
    Microsoft SQL Server 2005
    Microsoft SQL Server 2008
    Microsoft SQL Server 2008 R2
    Microsoft SQL Server 2012
    Please click to vote if the post helps you. This can be beneficial to other community members reading the thread.
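    As a rough sketch of the pieces involved (the table name, column sizes, and e-mail value below are illustrative, not taken from the original post): the subscription is driven by a table with one row per delivery, holding the comma-separated years in a single column, and the query defined for the data-driven subscription simply returns that column so it can be mapped to the report parameter and split by the filter expression above.
    -- Illustrative table driving the subscription; one row per delivery
    CREATE TABLE SubscriptionRecipients (
        EmailAddress NVARCHAR(256),
        Parameter    NVARCHAR(100),   -- comma-separated years, e.g. '2012,2013,2014'
        Comment      NVARCHAR(256)
    );
    INSERT INTO SubscriptionRecipients (EmailAddress, Parameter, Comment)
    VALUES ('user@example.com', '2012,2013,2014', NULL);   -- illustrative values
    -- Query entered in the data-driven subscription; the Parameter column is
    -- mapped to the report parameter and split inside the report filter
    SELECT EmailAddress, Parameter
    FROM SubscriptionRecipients;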

    For every Auftrag, there are multiple Position entries.
    The rest of the blocks don't seem to have any relation.
    So you can check this code to see how the internal table lt_str is built: its first three fields hold the Auftrag data and the next three fields hold the Position data. The structure is flat, assuming that every Position record is related to the preceding Auftrag.
    Try out this snippet.
    DATA lt_data TYPE TABLE OF string.
    DATA lv_data TYPE string.
    CALL METHOD cl_gui_frontend_services=>gui_upload
      EXPORTING
        filename = 'C:\temp\test.txt'
      CHANGING
        data_tab = lt_data
      EXCEPTIONS
        OTHERS   = 19.
    CHECK sy-subrc EQ 0.
    TYPES:
    BEGIN OF ty_str,
      a1 TYPE string,
      a2 TYPE string,
      a3 TYPE string,
      p1 TYPE string,
      p2 TYPE string,
      p3 TYPE string,
    END OF ty_str.
    DATA: lt_str TYPE TABLE OF ty_str,
          ls_str TYPE ty_str,
          lv_block TYPE string,
          lv_flag TYPE boolean.
    LOOP AT lt_data INTO lv_data.
      CASE lv_data.
        WHEN '[Version]' OR '[StdSatz]' OR '[Arbeitstag]' OR '[Pecunia]'
             OR '[Mita]' OR '[Kunde]' OR '[Auftrag]' OR '[Position]'.
          lv_block = lv_data.
          lv_flag = abap_false.
        WHEN OTHERS.
          lv_flag = abap_true.
      ENDCASE.
      CHECK lv_flag EQ abap_true.
      CASE lv_block.
        WHEN '[Auftrag]'.
          SPLIT lv_data AT ';' INTO ls_str-a1 ls_str-a2 ls_str-a3.
        WHEN '[Position]'.
          SPLIT lv_data AT ';' INTO ls_str-p1 ls_str-p2 ls_str-p3.
          APPEND ls_str TO lt_str.
      ENDCASE.
    ENDLOOP.

  • Script cannot find perl Data:Dumper module

    Good day. I live in Australia and use a script called Shepherd to update my mythtv program listings.
    After upgrading a couple of days ago (first time in about 6 months) Shepherd has stopped working.
    Shepherd doesn't seem to find the Perl Data::Dumper module, which is part of the base perl install.
    All my packages are up to date as of this post, and I am on the current kernel -
    uname -r
    3.4.9-1-ARCH
    ~/.shepherd/shepherd --check
    ERROR:
    Mandatory module 'Data::Dumper' not found.
    Please see the Wiki at http://svn.whuffy.com/wiki/Installation
    for details on how to install this module.
    Shepherd is updating ~/.shepherd/output.xmltv (the listings information scraped from the net) but evidently needs the dumper module to post the data into MythTV.
    I'm not sure if it has something to do with this, which I don't properly understand -
    The current Arch Linux default perl installation installs updates to core modules into the perl core directories, creating file conflicts.  Examples include modules such as Data::Dumper and version.
    which is mentioned here.
    I appear to have it installed as part of the perl base package -
    sudo pacman -Qi perl
    Name : perl
    Version : 5.16.1-1
    URL : http://www.perl.org
    Licenses : GPL PerlArtistic
    Groups : base
    Provides : perl-archive-extract=0.58 perl-archive-tar=1.82 perl-attribute-handlers=0.93 perl-autodie=2.10
    perl-autoloader=5.72 perl-autouse=1.07 perl-b-debug=1.17 perl-b-deparse=1.14 perl-b-lint=1.14
    perl-base=2.18 perl-bignum=0.29 perl-carp=1.26 perl-cgi=3.59 perl-compress-raw-bzip2=2.048
    perl-compress-raw-zlib=2.048 perl-constant=1.23 perl-cpan=1.9800 perl-cpan-meta=2.120630
    perl-cpan-meta-yaml=0.007 perl-cpanplus=0.9121 perl-cpanplus-dist-build=0.62 perl-data-dumper=2.135
    So it seems Shepherd is suddenly not finding it for some reason, after the upgrade.
    I tried installing the perl-data-dumper packages from the AUR (simple and concise) and neither of those made any difference.
    I also tried installing the module with a cpan command (as root), but that gave me errors (which I'd rather not have to deal with unless I must in order to fix this issue).
    cpan
    Terminal does not support AddHistory.
    cpan shell -- CPAN exploration and modules installation (v1.9800)
    Enter 'h' for help.
    cpan[1]> install data:dumper
    Reading '/root/.cpan/sources/authors/01mailrc.txt.gz'
    sh: /bin/gzip: No such file or directory
    ............................................................................DONE
    Reading '/root/.cpan/sources/modules/02packages.details.txt.gz'
    sh: /bin/gzip: No such file or directory
    Warning: Your /root/.cpan/sources/modules/02packages.details.txt.gz does not contain a Line-Count header.
    Please check the validity of the index file by comparing it to more
    than one CPAN mirror. I'll continue but problems seem likely to
    happen.
    Warning: Your /root/.cpan/sources/modules/02packages.details.txt.gz does not contain a Last-Updated header.
    Please check the validity of the index file by comparing it to more
    than one CPAN mirror. I'll continue but problems seem likely to
    happen.
    Subroutine AUTOLOAD redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 31.
    Subroutine import redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 49.
    Subroutine tv_interval redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 70.
    .Could not split line["H\"rW{0M4K\cV1=YFH\cV\c^rqv\c?\c]hKUe\cGL)4:4\cH]C\cPW`_|#MU4}\c_\c_%\cQ\cN\cX8M}%e\cR|\cG-K\c?~\cAM_cREx#AnvAZDi\$\c\!h\cZ\cI^uG0 )>TQ}p{Q[Ah\cNIK\cV+R"]
    Could not split line["-6W/GQp\cOm`>UwBP[5"]
    Could not split line["\@\cPI\cW,(zY2]uA!fVegsf\\V\cGfL\cHY4ki\cFvLgY\\\c\Ed\cK\cTe-\cC0\@d!?\cSU/\\G8\cAt?(\cY.L\cZ\cBrtl~s[4\@\cF\cWu*D>I"]
    Could not split line["hb\cSKA1\cCjX(w/K\cA"]
    Giving up parsing your /root/.cpan/sources/modules/02packages.details.txt.gz, too many errorsReading '/root/.cpan/sources/authors/01mailrc.txt.gz'
    sh: /bin/gzip: No such file or directory
    ............................................................................DONE
    Reading '/root/.cpan/sources/modules/02packages.details.txt.gz'
    sh: /bin/gzip: No such file or directory
    Warning: Your /root/.cpan/sources/modules/02packages.details.txt.gz does not contain a Line-Count header.
    Please check the validity of the index file by comparing it to more
    than one CPAN mirror. I'll continue but problems seem likely to
    happen.
    Subroutine AUTOLOAD redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 31.
    Subroutine import redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 49.
    Subroutine tv_interval redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 70.
    Warning: Your /root/.cpan/sources/modules/02packages.details.txt.gz does not contain a Last-Updated header.
    Please check the validity of the index file by comparing it to more
    than one CPAN mirror. I'll continue but problems seem likely to
    happen.
    Subroutine AUTOLOAD redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 31.
    Subroutine import redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 49.
    Subroutine tv_interval redefined at /usr/lib/perl5/site_perl/Time/HiRes.pm line 70.
    .Could not split line["H\"rW{0M4K\cV1=YFH\cV\c^rqv\c?\c]hKUe\cGL)4:4\cH]C\cPW`_|#MU4}\c_\c_%\cQ\cN\cX8M}%e\cR|\cG-K\c?~\cAM_cREx#AnvAZDi\$\c\!h\cZ\cI^uG0 )>TQ}p{Q[Ah\cNIK\cV+R"]
    Could not split line["-6W/GQp\cOm`>UwBP[5"]
    Could not split line["\@\cPI\cW,(zY2]uA!fVegsf\\V\cGfL\cHY4ki\cFvLgY\\\c\Ed\cK\cTe-\cC0\@d!?\cSU/\\G8\cAt?(\cY.L\cZ\cBrtl~s[4\@\cF\cWu*D>I"]
    Could not split line["hb\cSKA1\cCjX(w/K\cA"]
    Giving up parsing your /root/.cpan/sources/modules/02packages.details.txt.gz, too many errorsTerminal does not support GetHistory.
    Lockfile removed.
    It says that the 02packages.details.txt.gz file is missing, but it looks like it's there -
    ls /root/.cpan/sources/modules/
    02packages.details.txt.gz 03modlist.data.gz
    I thought I might uninstall and reinstall perl but I got dependency errors and I'm not game to force an uninstall unless I get advice to do so -
    pacman -Rs perl
    checking dependencies...
    error: failed to prepare transaction (could not satisfy dependencies)
    :: automake: requires perl
    :: cdrkit: requires perl
    :: ddclient: requires perl
    :: git: requires perl>=5.14.0
    :: groff: requires perl
    :: gtk-doc: requires perl
    :: hspell: requires perl
    :: hyphen: requires perl
    :: imagemagick: requires perl
    :: lm_sensors: requires perl
    :: mod_perl: requires perl
    :: openssl: requires perl
    :: perl-algorithm-diff: requires perl
    :: perl-archive-zip: requires perl>=5.10.0
    :: perl-class-factory-util: requires perl
    :: perl-class-inspector: requires perl>=5.6.0
    :: perl-class-load: requires perl
    :: perl-class-methodmaker: requires perl>=5.10.0
    :: perl-class-singleton: requires perl>=5.10.0
    :: perl-convert-binhex: requires perl
    :: perl-data-dump: requires perl>=5.006
    :: perl-data-optlist: requires perl-scalar-list-utils
    :: perl-datetime-format-builder: requires perl
    :: perl-dbi: requires perl
    :: perl-digest-sha1: requires perl
    :: perl-email-date-format: requires perl
    :: perl-encode-locale: requires perl>=5.008
    :: perl-error: requires perl>=5.10.0
    :: perl-fcgi: requires perl
    :: perl-file-listing: requires perl>=5.8.8
    :: perl-file-slurp: requires perl>=5.14.0
    :: perl-font-afm: requires perl>=5.5.0
    :: perl-html-form: requires perl>=5.8.8
    :: perl-html-formattext: requires perl>=5.10.0
    :: perl-html-parser: requires perl>=5.12.1
    :: perl-html-tagset: requires perl>=5.10.0
    :: perl-html-tree: requires perl>=5.10.0
    :: perl-http-cache-transparent: requires perl>=5.10.0
    :: perl-http-cookies: requires perl>=5.8.8
    :: perl-http-daemon: requires perl>=5.8.8
    :: perl-http-date: requires perl>=5.8.8
    :: perl-http-message: requires perl>=5.8.8
    :: perl-http-negotiate: requires perl>=5.8.8
    :: perl-image-size: requires perl
    :: perl-io-socket-ssl: requires perl>=5.10.0
    :: perl-io-string: requires perl>=5.10.0
    :: perl-io-stringy: requires perl
    :: perl-libwww: requires perl>=5.8.1
    :: perl-lingua-en-numbers-ordinate: requires perl>=5.10.0
    :: perl-lingua-preferred: requires perl>=5.10.0
    :: perl-linux-pid: requires perl
    :: perl-list-moreutils: requires perl>=5.5.30
    :: perl-lwp-mediatypes: requires perl>=5.6.2
    :: perl-lwp-protocol-https: requires perl>=5.8.1
    :: perl-math-round: requires perl
    :: perl-mime-lite: requires perl
    :: perl-mime-types: requires perl
    :: perl-mozilla-ca: requires perl>=5.006
    :: perl-net-http: requires perl>=5.6.2
    :: perl-net-upnp: requires perl
    :: perl-package-stash: requires perl>=5.8.1
    :: perl-package-stash: requires perl-scalar-list-utils
    :: perl-package-stash-xs: requires perl>=5.8.1
    :: perl-params-classify: requires perl>=5.10.1
    :: perl-params-util: requires perl>=5.5.30
    :: perl-params-validate: requires perl
    :: perl-parse-recdescent: requires perl>=5.10.0
    :: perl-perl4-corelibs: requires perl>=5.006
    :: perl-soap-lite: requires perl
    :: perl-socket6: requires perl
    :: perl-sub-exporter: requires perl>=5.6.0
    :: perl-sub-install: requires perl
    :: perl-sub-uplevel: requires perl
    :: perl-term-readkey: requires perl
    :: perl-test-warn: requires perl>=5.10.0
    :: perl-text-iconv: requires perl
    :: perl-timedate: requires perl
    :: perl-tk: requires perl
    :: perl-unicode-string: requires perl
    :: perl-unicode-utf8simple: requires perl>=5.10.0
    :: perl-uri: requires perl>=5.10.0
    :: perl-www-robotrules: requires perl>=5.8.1
    :: perl-xml-dom: requires perl>=5.10.0
    :: perl-xml-namespacesupport: requires perl
    :: perl-xml-parser: requires perl
    :: perl-xml-regexp: requires perl
    :: perl-xml-sax: requires perl
    :: perl-xml-sax-base: requires perl
    :: perl-xml-simple: requires perl
    :: perl-xml-twig: requires perl
    :: perl-xml-writer: requires perl
    :: perl-xml-xpath: requires perl
    :: perl-yaml: requires perl>=5.10.0
    :: perl-yaml-syck: requires perl
    :: rsync: requires perl
    :: system-tools-backends: requires perl
    :: webmin: requires perl
    Any help would be greatly appreciated.
    Thanks
    belbo
    Last edited by belbo (2012-08-26 00:33:52)

    Thanks
    Results as follows -
    perl -MData::Dumper -e1
    Perl API version v5.14.0 of Data::Dumper does not match v5.16.0 at /usr/lib/perl5/site_perl/XSLoader.pm line 95.
    Compilation failed in require.
    BEGIN failed--compilation aborted.
    $ perl /mnt/Data/Scripts/Perl_Module_Check.pl wheremod Data::Dumper
    Data::Dumper /usr/lib/perl5/site_perl/Data/Dumper.pm
    Data::Dumper /usr/lib/perl5/core_perl/Data/Dumper.pm
    I ran the full delete command after backing up the directories (there was a lot in them)
    rm -rf /usr/lib/perl5/site_perl/* /usr/share/perl5/site_perl/*
    This resulted in a different missing module -
    $ sudo rm -rf /usr/lib/perl5/site_perl/* /usr/share/perl5/site_perl/*
    $ ~/.shepherd/shepherd --check
    ERROR:
    Mandatory module 'List::Compare' not found.
    I restored the backup and deleted only Data::Dumper but then got a missing XMLTV module -
    $ ~/.shepherd/shepherd --check
    $[ used in numeric lt (<) (did you mean $] ?) at /usr/share/perl5/site_perl/XML/Twig.pm line 7278.
    $[ used in numeric lt (<) (did you mean $] ?) at /usr/share/perl5/site_perl/XML/Twig.pm line 7284.
    $[ used in numeric lt (<) (did you mean $] ?) at /usr/share/perl5/site_perl/XML/Twig.pm line 7296.
    ERROR:
    Mandatory module 'XMLTV' not found.
    Following this post, I did the following, which returned no results -
    https://bbs.archlinux.org/viewtopic.php?pid=1110598
    $ sudo pacman -Qml | awk '/_perl\/auto\/.+\.so$/ { print $1 }' | uniq
    I then did the following -
    $ sudo rm -rf /usr/lib/perl5/*
    $ sudo pacman -S perl
    warning: perl-5.16.1-1 is up to date -- reinstalling
    resolving dependencies...
    looking for inter-conflicts...
    Targets (1): perl-5.16.1-1
    Total Installed Size: 48.33 MiB
    Net Upgrade Size: 0.00 MiB
    Proceed with installation? [Y/n]
    (1/1) checking package integrity [#######################################################] 100%
    (1/1) loading package files [#######################################################] 100%
    (1/1) checking for file conflicts [#######################################################] 100%
    (1/1) checking available disk space [#######################################################] 100%
    (1/1) upgrading perl [#######################################################] 100%
    $ ~/.shepherd/shepherd --check
    ERROR:
    Mandatory module 'XMLTV' not found.
    Oops. Then I saw your next post saying not to do that, as you'd lose the vendor modules, which I did. I had backed up, so I restored, and still have the vendor stuff.
    In case its of any assistance, this is what my perl lib and share directories look like -
    [ben@htpc ~]$ ls /usr/lib/perl5
    core_perl site_perl site_perl.bak vendor_perl
    [ben@htpc ~]$ ls /usr/lib/perl5/site_perl
    Alien auto Data Encode encoding.pm Filter List Net perlfilter.pod SVN Term Unicode version.pm XML
    Attribute Compress Digest Encode.pm FCGI.pm filter-util.pl Math Params Storable.pm Sys Time version version.pod XSLoader.pm
    [ben@htpc ~]$ ls /usr/lib/perl5/vendor_perl
    Apache Attribute Crypt DateTimePP.pm dbixs_rev.pl GD.pm Image ModPerl Package SNMP.pm Text Unicode YAML
    Apache2 auto DateTime DBD Digest gv.pm JSON mod_perl2.pm Params Socket6.pm Tie Win32
    APR Bundle DateTime.pm DBI FCGI.pm gv.so Linux Net qd.pl SVN Tk Xfce4
    APR.pm Class DateTimePPExtra.pm DBI.pm GD HTML List NetSNMP RRDs.pm Term Tk.pm XML
    [ben@htpc ~]$ ls /usr/lib/perl5/core_perl
    arybase.pm Compress Cwd.pm DynaLoader.pm Fcntl.pm I18N List ODBM_File.pm perllocal.pod Socket.pm threads.pm
    attributes.pm Config_git.pl Data Encode File IO Math Opcode.pm POSIX.pm Storable.pm Tie
    auto Config_heavy.pl DB_File.pm Encode.pm Filter IO.pm MIME O.pm re.pm Sys Time
    B Config.pm Devel encoding.pm GDBM_File.pm IPC mro.pm ops.pm Scalar Text Unicode
    B.pm CORE Digest Errno.pm Hash lib.pm NDBM_File.pm PerlIO SDBM_File.pm threads
    [ben@htpc ~]$ ls /usr/share/perl5/
    core_perl site_perl vendor_perl
    [ben@htpc ~]$ ls /usr/share/perl5/core_perl
    AnyDBM_File.pm bignum.pm Config Digest.pm fields.pm IO Object PerlIO.pm Term utf8.pm
    App bigrat.pm constant.pm DirHandle.pm File IPC open.pm pod Test vars.pm
    Archive blib.pm CPAN Dumpvalue.pm FileCache.pm JSON overload Pod Test.pm version
    Attribute bytes_heavy.pl CPANPLUS dumpvar.pl FileHandle.pm less.pm overloading.pm Safe.pm Text Version
    autodie bytes.pm CPANPLUS.pm Encode filetest.pm Locale overload.pm Search Thread version.pm
    autodie.pm Carp CPAN.pm encoding Filter locale.pm Package SelectSaver.pm Thread.pm vmsish.pm
    AutoLoader.pm Carp.pm DBM_Filter English.pm FindBin.pm Log Params SelfLoader.pm Tie warnings
    AutoSplit.pm CGI DBM_Filter.pm Env.pm Getopt Math parent.pm sigtrap.pm Time warnings.pm
    autouse.pm CGI.pm DB.pm Exporter HTTP Memoize Parse sort.pm Unicode XSLoader.pm
    B _charnames.pm deprecate.pm Exporter.pm I18N Memoize.pm Perl strict.pm unicore
    base.pm charnames.pm Devel ExtUtils if.pm Module perl5db.pl subs.pm UNIVERSAL.pm
    Benchmark.pm Class diagnostics.pm Fatal.pm inc Net perlfaq.pm Symbol.pm User
    bigint.pm Compress Digest feature.pm integer.pm NEXT.pm PerlIO TAP utf8_heavy.pl
    [ben@htpc ~]$ ls /usr/share/perl5/site_perl
    Alien bigint.pm Bundle Class CPANPLUS DateTime ExtUtils HTML IPC Locale LWP.pm Math MythTV Object Try YAML
    Archive bignum.pm CGI Compress CPANPLUS.pm Encode File HTTP JSON LWP lwptut.pod Module MythTV.pm Test WWW YAML.pm
    Attribute bigrat.pm CGI.pm CPAN Date Env.pm Filter IO List lwpcook.pod Mail Mozilla Net Tie XML
    [ben@htpc ~]$ ls /usr/share/perl5/vendor_perl
    abbrev.pl bigrat.pl ctime.pl Error Font HTML LWP.pm open2.pl SOAP Time validate.pl
    Algorithm cacheout.pl Data Error.pm ftp.pl HTTP Mail open3.pl stat.pl timelocal.pl WWW
    Apache CGI Date exceptions.pl getcwd.pl Image Math Package Sub Tree XML
    Archive CGI.pm DateTime fastcwd.pl getopt.pl importenv.pl MIME Parse syslog.pl Try XMLRPC
    assert.pl chat2.pl Dist File getopts.pl IO Module Perl4 tainted.pl UDDI XMLTV
    auto Class dotsh.pl finddepth.pl Git Lingua Mozilla pwd.pl Term Unicode XMLTV.pm
    bigfloat.pl complete.pl Email find.pl Git.pm look.pl Net RRDp.pm termcap.pl URI YAML
    bigint.pl Convert Encode flush.pl hostname.pl LWP newgetopt.pl shellwords.pl Test URI.pm YAML.pm
    Any further assistance would be greatly appreciated.
    Thanks
    belbo
    Last edited by belbo (2012-08-27 13:40:14)

  • Data monitor app

    Does Apple have a diagnostic app, mainly for data usage, that shows which app or service uses my data, and that one can get access to?
    Three times in the last three weeks I have used up my 5 GB of data (and had to buy more). It is almost impossible to track down, since I can't install Wireshark or anything similar.
    An app like NetLimiter with logging for Windows would be perfect. I suspect my former podcast app (Podcaster), since it is the only app that downloads large amounts of data, but it should only do so over Wi-Fi. During the last two days I have used Downcaster, but I still hit the limit (it might have been close already).
    I have mailed my provider, since the logs they provide on their home page say that I use 99.9% of my data at midnight, so that must be the total for the whole day. I also have Wi-Fi at that time, and during the day the logs only show a few KB used each hour.
    I also noticed that I only had about 50% battery life left 4 hours after a full charge. I listened to about 70 minutes of audio podcasts. My iPhone 4 is two years old now and I have used it a lot, so that might not be extremely bad.

    If you launch the App Store, select Search at the bottom, and type Data Usage into the search field, you will see results for 63 different apps.
    Apple does not have a data usage app, but there is some information in Settings > General > Usage > Cellular Usage.
    The figure there is only an estimate and should not be used to determine the amount of data you are using; check with your carrier for that information.

  • Understand "Department Script" in Employee Directory [Adobe AIR app]

    OK, I've been bashing my head against this simple issue for long enough... I hope to find enough info to make good use of a great button in ED.
    The Department link: when viewing an employee's information in ED, you can click the employee's department name and be taken to a new panel that displays all other employees in the same department. Wow, I like it...
    As much as it is good to see who else is in a department, I wish I could simply use this as a static list, because if I don't know the name of an employee in a specific department... well, I just can't get there or take advantage of that list.
    So I went, as usual, and thought I could just reuse the same script and create a static list... but, as usual, the app is so tightly knit together that I can't seem to break its tie to the employee.
    I thought I would have to go back and find where it starts [i.e. dataProvider="{ department.employeees }"] and use the dataProvider to populate my new list (in this case a datagrid).
    Well, it doesn't work, and my knowledge is too weak to figure it out, as I still struggle with the syntax and its meaning. I think I am getting confused mostly by the [ change="displayEmployee (event.target.selectedItem as Emlpoyee) ] part; anything in the script related to that change-employee handler throws me, and I feel that must be the part that links the info to the employee at hand.
    I am out of ideas?!
    ...where the only way to trigger the department panel list is to open an employee from that department first.
    Here's what I am trying to do:
    Instead of clicking on the department link in the employee bio, I want to click on a button that opens a panel with a static list that never changes, like the one I added at the bottom of the screenshot.
    Thank you
    Employee directory files can be found here  :
    http://download.macromedia.com/pub/developer/air/sample_apps/EmployeeDirectory.zip
    More information can be found in the original post I made here [ http://forums.adobe.com/thread/545756?tstart=0 ]
    I created a new post as it is clearer and will be more useful to more people if we find the solution.
    Thank you, again!

    Hey!
    Here's an example of how to get data from a database using Adobe Flex 3, AIR and SQLite.

  • Can I develop a data acquisition app without the DAQ card installed?

    I'm new to LabView.
    I would like to build a simple data acquisition app (graphical display and logging). I would prefer to build this on my desktop, and then install/run it on a laptop.
    I'm not sure how to approach this, as the DAQ card and driver are not installed on my desktop. Is there a (hopefully) simple way to configure a data source in LabVIEW so that the app will automatically (or at least easily) connect to the existing driver?
    At the end of the day I don't want to install the complete dev system on my cheap little laptop just for this one application.
    TIA,
    Michael

    Yes, you can develop an application without having a DAQ card installed. You will still use the same functions in LabVIEW. If you are using LabVIEW 6.1, they are under the Data Acquisition palette in the block diagram. If you are using LV7, there are now two types of data acquisition functions: Traditional NI-DAQ (before LV7) and the new DAQmx. The DAQ functions are under the NI Measurements palette in LV7. If you are new to LV, I suggest you use the DAQ Assistant at the bottom of the palette. It lets you easily configure what you want; the drawback is that you need to use it on the computer that has the card. So try the Traditional NI-DAQ functions; they will let you develop without the card. Hope this helps.
    BJD1613
    Lead Test Tools Development Engineer
    Philips Respironics
    Certified LV Architect / Instructor

  • Copy data between Apps filtered by  workstatus

    Hello all gurus,
    BPC 7.5 NW
    I am copying data between apps. I want to do it with ABAP code because I use a BADI.
    I don't want to copy all data, only data that has been filtered by work status. I can't use DESTINATION_APP in Script Logic because I need to filter.
    The structure of both apps is the same. The origin app is APP_A and the destination app is APP_B.
    I only copy data if the work status is 'Approved'.
    App: APP_A                 App: APP_B
    Dim1 WS
    Dim2 WS
    Dim3 WS
    Dim4
    Dim5
    If workstatus(DIM1, DIM2, DIM3) = "Approved" then
            Copy Dim1..Dim5 from APP_A to APP_B
    else
            Nothing
    end if
    Example:    2014,Actual,Account1,E1toN,Act1toN   copy to  2014,Actual,Account1,E1toN,Act1toN (same rows, same structure)
    Note: Act1toN= act1, act2, act3,...actN
    Then, my filter is:
    SETAPP = WB
    APP_A = BHN
    Work status table --> /1CPMB/LKWSWBBHN
    SELECT statuscode FROM /1CPMB/LKWSWBBHN WHERE DIM1 = 2014 AND DIM2 = Actual AND DIM3 = Account1
    if statuscode = '5' then  // 5 is Approved (STATID) in UJW_STATCODE
         copy data 2014,Actual,Account1,E1toN,Act1toN from APP_A to 2014,Actual,Account1,E1toN,Act1toN in APP_B (same rows, same structure)
    end if
    If you need more information, don't hesitate to ask.
    thanks
    Gustavo

    The main How-To page: Enterprise Performance Management (EPM) How-to Guides - Business Process Expert - SCN Wiki
    Please read the BADI-related how-to guides for BPC NW 7 and 7.5 (BADIs for BPC 7 can be used in 7.5).
    Try this: How To Custom BADI for Rounding Off Values in SAP BUSINESSOBJECTS Planning and Consolidation, Version for SAP NetWeaver
    Also read the BADI sample in the help.
    In order to write data to another cube you have to use write_back_int with the appropriate application selected.
    B.R. Vadim
    P.S. Read also here: Useful ABAP code in BPC 7.X NW version
    Message was edited by: Vadim Kalinin - P.S. Added

  • Regarding infotype 0580 (it contains the previous year's date)

    Dear Friends,
    I am using infotype 0580 (Previous Employment Tax Details), but the "to date" field in that infotype contains the wrong date: it shows the previous year's date. The field is also disabled, so we are not able to change it. It gives the error "End date precedes start date".
    Kindly guide me on where in the configuration it is taking the previous year's date from.
    thanks
    sandeep

    Hi,
    IT 0580 actually works on the fiscal year; that is the reason it shows the previous year's date. Let me know whether the employee joined in the current fiscal year or the previous fiscal year.
    If the employee joins in the middle of the fiscal year, IT 0580 shows the "to date" as disabled.
    Please check the dates; it's all a game of dates.
    Hope you can find the solution within the infotype.
    - Ashish

  • Does the restore procedure from iTunes return all my data and apps?

    Every day I back up my iPhone 4 via iTunes.
    If in the future the iPhone is lost or gets damaged (I hope not) and I buy a new one, my question is:
    after restoring via iTunes, will I get the same apps, the same data (I am writing a personal diary), the same contacts, SMS, mail setup, etc.?
    I do not care about photos and video because I save them to my computer.
    Over the past two years I have bought apps from the App Store and downloaded a lot of free apps.
    I have also deleted a lot of apps.
    That is why I am asking this question.
    If there were an app that makes an image (like on a PC) so that I get the same data back, that would be great.
    I would be very happy if someone could help me with my question.
    Shai.

    http://support.apple.com/kb/ht4946
    http://support.apple.com/kb/HT4859

  • Need a script to import the data from flat file

    Hi Friends,
    Does anyone have a script to import data from flat files into an Oracle database (Linux OS)? I need to automate it to run every 30 minutes, check for flat files in the incoming directory, and process them without user interaction.
    Thanks.
    Srini
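    One common approach, as a sketch only (the directory path, file name, and table and column names below are assumptions, and the file is taken to be comma-separated), is an Oracle external table over the incoming directory, loaded by a job scheduled every 30 minutes via cron or DBMS_SCHEDULER:
    -- Directory object pointing at the incoming folder (path is an assumption)
    CREATE OR REPLACE DIRECTORY incoming_dir AS '/data/incoming';
    -- External table that reads the flat file in place (assumed CSV layout)
    CREATE TABLE stg_incoming_ext (
      col1 VARCHAR2(100),
      col2 VARCHAR2(100),
      col3 VARCHAR2(100)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY incoming_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('incoming.csv')
    )
    REJECT LIMIT UNLIMITED;
    -- Load step that the scheduled job would run every 30 minutes
    INSERT INTO target_table (col1, col2, col3)
    SELECT col1, col2, col3 FROM stg_incoming_ext;
    COMMIT;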

    Here is my init.ora file
    # $Header: init.ora 06-aug-98.10:24:40 atsukerm Exp $
    # Copyright (c) 1991, 1997, 1998 by Oracle Corporation
    # NAME
    # init.ora
    # FUNCTION
    # NOTES
    # MODIFIED
    # atsukerm 08/06/98 - fix for 8.1.
    # hpiao 06/05/97 - fix for 803
    # glavash 05/12/97 - add oracle_trace_enable comment
    # hpiao 04/22/97 - remove ifile=, events=, etc.
    # alingelb 09/19/94 - remove vms-specific stuff
    # dpawson 07/07/93 - add more comments regarded archive start
    # maporter 10/29/92 - Add vms_sga_use_gblpagfile=TRUE
    # jloaiza 03/07/92 - change ALPHA to BETA
    # danderso 02/26/92 - change db_block_cache_protect to dbblock_cache_p
    # ghallmar 02/03/92 - db_directory -> db_domain
    # maporter 01/12/92 - merge changes from branch 1.8.308.1
    # maporter 12/21/91 - bug 76493: Add control_files parameter
    # wbridge 12/03/91 - use of %c in archive format is discouraged
    # ghallmar 12/02/91 - add global_names=true, db_directory=us.acme.com
    # thayes 11/27/91 - Change default for cache_clone
    # jloaiza 08/13/91 - merge changes from branch 1.7.100.1
    # jloaiza 07/31/91 - add debug stuff
    # rlim 04/29/91 - removal of char_is_varchar2
    # Bridge 03/12/91 - log_allocation no longer exists
    # Wijaya 02/05/91 - remove obsolete parameters
    # Example INIT.ORA file
    # This file is provided by Oracle Corporation to help you customize
    # your RDBMS installation for your site. Important system parameters
    # are discussed, and example settings given.
    # Some parameter settings are generic to any size installation.
    # For parameters that require different values in different size
    # installations, three scenarios have been provided: SMALL, MEDIUM
    # and LARGE. Any parameter that needs to be tuned according to
    # installation size will have three settings, each one commented
    # according to installation size.
    # Use the following table to approximate the SGA size needed for the
    # three scenarious provided in this file:
    # -------Installation/Database Size------
    # SMALL MEDIUM LARGE
    # Block 2K 4500K 6800K 17000K
    # Size 4K 5500K 8800K 21000K
    # To set up a database that multiple instances will be using, place
    # all instance-specific parameters in one file, and then have all
    # of these files point to a master file using the IFILE command.
    # This way, when you change a public
    # parameter, it will automatically change on all instances. This is
    # necessary, since all instances must run with the same value for many
    # parameters. For example, if you choose to use private rollback segments,
    # these must be specified in different files, but since all gc_*
    # parameters must be the same on all instances, they should be in one file.
    # INSTRUCTIONS: Edit this file and the other INIT files it calls for
    # your site, either by using the values provided here or by providing
    # your own. Then place an IFILE= line into each instance-specific
    # INIT file that points at this file.
    # NOTE: Parameter values suggested in this file are based on conservative
    # estimates for computer memory availability. You should adjust values upward
    # for modern machines.
    # You may also consider using Database Configuration Assistant tool (DBCA)
    # to create INIT file and to size your initial set of tablespaces based
    # on the user input.
    # replace DEFAULT with your database name
    db_name=DEFAULT
    db_files = 80 # SMALL
    # db_files = 400 # MEDIUM
    # db_files = 1500 # LARGE
    db_file_multiblock_read_count = 8 # SMALL
    # db_file_multiblock_read_count = 16 # MEDIUM
    # db_file_multiblock_read_count = 32 # LARGE
    db_block_buffers = 100 # SMALL
    # db_block_buffers = 550 # MEDIUM
    # db_block_buffers = 3200 # LARGE
    shared_pool_size = 3500000 # SMALL
    # shared_pool_size = 5000000 # MEDIUM
    # shared_pool_size = 9000000 # LARGE
    log_checkpoint_interval = 10000
    processes = 50 # SMALL
    # processes = 100 # MEDIUM
    # processes = 200 # LARGE
    parallel_max_servers = 5 # SMALL
    # parallel_max_servers = 4 x (number of CPUs) # MEDIUM
    # parallel_max_servers = 4 x (number of CPUs) # LARGE
    log_buffer = 32768 # SMALL
    # log_buffer = 32768 # MEDIUM
    # log_buffer = 163840 # LARGE
    # audit_trail = true # if you want auditing
    # timed_statistics = true # if you want timed statistics
    max_dump_file_size = 10240 # limit trace file size to 5 Meg each
    # Uncommenting the line below will cause automatic archiving if archiving has
    # been enabled using ALTER DATABASE ARCHIVELOG.
    # log_archive_start = true
    # log_archive_dest = disk$rdbms:[oracle.archive]
    # log_archive_format = "T%TS%S.ARC"
    # If using private rollback segments, place lines of the following
    # form in each of your instance-specific init.ora files:
    # rollback_segments = (name1, name2)
    # If using public rollback segments, define how many
    # rollback segments each instance will pick up, using the formula
    # # of rollback segments = transactions / transactions_per_rollback_segment
    # In this example each instance will grab 40/5 = 8:
    # transactions = 40
    # transactions_per_rollback_segment = 5
    # Global Naming -- enforce that a dblink has same name as the db it connects to
    global_names = TRUE
    # Edit and uncomment the following line to provide the suffix that will be
    # appended to the db_name parameter (separated with a dot) and stored as the
    # global database name when a database is created. If your site uses
    # Internet Domain names for e-mail, then the part of your e-mail address after
    # the '@' is a good candidate for this parameter value.
    # db_domain = us.acme.com      # global database name is db_name.db_domain
    # FOR DEVELOPMENT ONLY, ALWAYS TRY TO USE SYSTEM BACKING STORE
    # vms_sga_use_gblpagfil = TRUE
    # FOR BETA RELEASE ONLY. Enable debugging modes. Note that these can
    # adversely affect performance. On some non-VMS ports the db_block_cache_*
    # debugging modes have a severe effect on performance.
    #_db_block_cache_protect = true # memory protect buffers
    #event = "10210 trace name context forever, level 2" # data block checking
    #event = "10211 trace name context forever, level 2" # index block checking
    #event = "10235 trace name context forever, level 1" # memory heap checking
    #event = "10049 trace name context forever, level 2" # memory protect cursors
    # define parallel server (multi-instance) parameters
    #ifile = ora_system:initps.ora
    # define two control files by default
    control_files = (ora_control1, ora_control2)
    # Uncomment the following line if you wish to enable the Oracle Trace product
    # to trace server activity. This enables scheduling of server collections
    # from the Oracle Enterprise Manager Console.
    # Also, if the oracle_trace_collection_name parameter is non-null,
    # every session will write to the named collection, as well as enabling you
    # to schedule future collections from the console.
    # oracle_trace_enable = TRUE
    # Uncomment the following line, if you want to use some of the new 8.1
    # features. Please remember that using them may require some downgrade
    # actions if you later decide to move back to 8.0.
    #compatible = 8.1.0
    Thanks.
    Srini

  • Day One script errors on Compiling and Viewing App

    Day One script errors on Compiling and Viewing App. The video will not work.

    Well, today it is working, which makes me look foolish, not for the first time.
    When I had trouble, the error message said that the script had script errors. When I dismissed or ignored it, the video never started.
    That's OK, I have enjoyed the tutorial so far, the best I have used.
    Before this I tried to learn Flex 3 and gave up trying to use a database. My daughter wants me to build an e-commerce site for her homemade jewelry, so maybe it can be done in Flash 4. Dreamweaver says you can connect to a business model that lets you build an e-commerce site, but I think it has not come out yet.
    Hi RyderMac,
    I just clicked on the video link for http://www.adobe.com/devnet/flex/videotraining/xml/fiaw_v1_05.html
    and the video and audio are working fine for me. Are you getting any kind of error message?

  • Settings:General:Date & Time: Year date is 2554BE

    Hi, can anyone help me.
    The problem: in Settings > General > Date & Time, the year shows as 2554 BE. The year selection only gives me a choice of about 50 years around 2554 BE; when I spin the wheels to select 2011, the year dates are greyed out and the wheels automatically flip back to 1st January 2554 BE.
    This problem is affecting dates and times in Mail and Calendar and in some apps where I am trying to book accommodation, flights, etc.
    I noticed this after an update/syncing session in late November or early December. I didn't understand what the ramifications would be until I started to use my travel apps for booking things.
    I have tried syncing my iPhone 3 to iTunes in the hope it would fix it, but it hasn't so far.
    regards
    Dianne

    It sounds like you are using the wrong calendar (2554 BE is the Buddhist Era equivalent of 2011). Go to Settings > General > International and make sure the Calendar setting is Gregorian.

  • Can configurator/profile manager group apps

    Apologies if this post is in the wrong section.
    I have been asked to set up Profile Manager in such a way that the on-site admin can not only wirelessly deploy apps to devices, but also group and re-group those apps on the devices' screens over the air. The end result should be that end users do not have to wade through a sea of apps just to find one application (users include a large number of very young children, so it has to be easy to direct them to specific apps).
    Now, I'm aware that this could be achieved by making a backup of a pre-configured device. However, this doesn't work well for maintaining the iPads' configuration, since any newly added apps would fall outside the grouping, and the devices would need to be returned to the server room to have their configurations updated with a new backup image, which would defeat the purpose of using Profile Manager. I have attempted to convince the on-site admin to assign the devices to groups in Profile Manager so that Group A only gets the apps it uses, Group B only gets the apps it uses, and so on, but he really isn't keen on the idea because there would still be far too many apps (we're talking 50 or so per device) to wade through just to find the one the class will use.
    Is this possible, or will it be possible at some point in the future? The advances made by this new version of Profile Manager and the VPP store are fantastic, but it would be nice if this feature could be added at some point.
    Kindly
    Stom

    If I remember correctly, the user (Apple ID) gets a 30-day grace period during which they can migrate data from the app and continue using it. When that period ends, they get the opportunity to buy the app for themselves and continue using it with their own money.

  • HT1918 I need to know how to delete my credit card from my account because I no longer have that credit card. I just want to be able to update my apps, but I can't now because there is a credit card on my account. What do I need to do to delete it?

    I need to know how to delete my credit card from my account because I no longer have that credit card. I just want to be able to update my apps, but I can't now because there is a credit card on my account. What do I need to do to delete it?

    Also, my credit card has expired, so I need to delete it.

Maybe you are looking for

  • Bugs with 10.4 (80), 6 and counting...

    I've downloaded the latest iTunes version (I'm running Leopard 10.5.8), and since then I've found 6 annoying bugs: 1. Already discussed in here, the "Drag & Drop" feature is not working anymore, at least for me, to create new playlists. With previous versions I

  • RF menu doesn't fit screen

    Hi folks, I'm observing a strange behavior. I use a PDA with a barcode scanner and display the standard SAP ITSMobile menu (lm00) in the Internet Explorer of that Windows CE device. I've set the presentation format to 16x20. The PDA has a QVGA displa

  • Function Module's for Shopping Cart.

    Hi all, I want to know which function module gives item details and the status of a shopping cart in BBP_PD. Also, can anybody please guide me on how to use those in simple reports? Thanks in advance. Regards! Rahul

  • Core dump in _ti_mutex_unlock

    Greetings, I am using compiler (Sun C++ 5.5 2003/03/12 on a Sun/SunOS 5.8 system and I am getting the following traceback. Does anybody know what patch I need to install to get around this issue? I thought I had installed all the specified patches for

  • PI setting for transferring Contract from SRM to ECC - Classic scenario

    Dear Experts, SRM 702, ECC 6.0 EHP6, SRM - Classic scenario. Need to understand the PI settings for transferring a Contract from SRM to ECC, as currently the Central Contract document is getting created in SRM, but it is not getting transferred to ECC. No XML mes