Method Limits? a stressed newbie

Please, can anyone tell me if I can have the following in a method?
(WHOLE CLASS)
class SuperMethod
     public static double aTaxRate (double currentSalary)
          if (currentSalary < 6000)
               currentSalary = currentSalary * 0.15;
               return (currentSalary);
          else if ((currentSalary >=6001)|(currentSalary <= 15000))
               currentSalary = currentSalary * 0.25;
               return (currentSalary);
          else if (currentSalary >= 15001)
               currentSalary = currentSalary * 0.45;
               return(currentSalary);
     public static void main (String[]args)
          double currentSalaryInput;
          System.out.println("Howdy !! please enter the feckin current salary");
          currentSalaryInput = EasyIn.getDouble();
          System.out.println(aTaxRate(currentSalaryInput));
I use EasyIn to get the value. Basically I need to evaluate the salary and then output the new amount based on three different tax rates (15%, 25% and 45%), but it keeps outputting an error on line 4!
much thanks

You have an error in your code here...
else if ((currentSalary >=6001)|(currentSalary <= 15000))
replace the | with ||. A single | is the bitwise (non-short-circuit) OR; || is the logical OR, which is probably what you want here :)
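For what it's worth, the || change alone won't cure the "error on line 4": the class and every method also need braces, and a method declared to return double must return on every path. A minimal compilable sketch, keeping the original rates, with java.util.Scanner standing in for EasyIn (an assumption, since the EasyIn helper class isn't shown in the thread):

```java
import java.util.Scanner;

class SuperMethod {

    // Returns currentSalary multiplied by the applicable rate,
    // exactly as in the original post.
    public static double aTaxRate(double currentSalary) {
        if (currentSalary < 6000) {
            return currentSalary * 0.15;
        } else if (currentSalary <= 15000) {   // first branch already caught < 6000
            return currentSalary * 0.25;
        } else {
            return currentSalary * 0.45;       // plain else guarantees a return on every path
        }
    }

    public static void main(String[] args) {
        Scanner in = new Scanner(System.in);   // stand-in for EasyIn.getDouble()
        System.out.println("Howdy !! please enter the current salary");
        double currentSalaryInput = in.nextDouble();
        System.out.println(aTaxRate(currentSalaryInput));
    }
}
```

Note that with an if/else chain the compound (currentSalary >= 6001) | (currentSalary <= 15000) test isn't needed at all: the else-if only runs when the first branch failed, so a single <= 15000 check covers the middle band. (And with OR, that compound test would be true for every salary; a genuine range check would want &&.)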

Similar Messages

  • Input Method Limitation on BB10 (only 3)

    Hello, I love the physical keyboard and the OS of my Q10, but it has a huge problem (at least for me): it won't let me select more than 3 input languages (such as French, English, Chinese...). I need to select at least five. Android, iOS and WP don't have that limitation (the last time I used a phone with a similar limitation it was a Symbian^3). I was wondering if this will be modified in a future version of the OS? Do you guys have some information about that? Thanks, Pierre

    The default setting is that only Tabs that contain information are shown, maybe your recording was too short to produce any method samples?
    You can force the JRMC GUI to display all tabs through Window->Preferences->JRockit Mission Control->JRockit Runtime Analyzer and select the "Show every tab..." checkbox.
    If that doesn't work, send a copy of the recording to jrockit-improve AT bea and we'll take a look. Or better yet, submit a bug report to bugzilla.bea.com.
    And if you haven't done so already, take a look at the new JRMC update site for JRMC Eclipse Plug-Ins: https://dev2devclub.bea.com/updates/eclipse-3.3/jrmc/.
    -- Henrik

  • Dimension building reference methods limitations

    Are there any limitations on dimension build reference methods? I am finding it very difficult to choose which reference method to use, and when, and why.
    Can anyone please help me with this FAQ?

    When you are building dimensions or loading data through the generation reference method, and you can't find the dimension name in the dimension build settings or data load settings, what procedure do you follow to build members for that dimension? We never do data loading through generation references.
    If there are any additional references in the data file other than the dimensions in the outline, we simply skip them with the option "Ignore Field for Data Loading".
    If a dimension is not defined before developing the rule file for dimension building, we define it in the rule file's dimension definition tab.
    See the below thread to have more idea about this.
    Re: dimension building

  • F110: one invoice amount split into different cheques due to payment method limits

    hi experts,
    I had a problem while running the automatic payment program (F110). We are using the cheque payment method, and the maximum amount authorised for the cheque payment method is 5,00,000. At the time of the payment run, if the invoice amount exceeds the payment method limit, it shows up in the exception list.
    Kindly suggest how to generate multiple cheques for one invoice.
    E.g.: invoice amount: 7,00,000
    Limit for a single cheque: 5,00,000
    How can I get two cheques, i.e. 5 lakhs and 2 lakhs, while running F110?
    Please give me a solution.
    rgds

    hi,
    Thanks for your reply. The problem is that while running F110 the invoice payment amount exceeds the cheque payment method limit and shows up in the exception list.
    In my case the invoice amount is 7,00,000.
    We are using the cheque payment method and the per-cheque limit is 5,00,000. If a cheque exceeds 5,00,000 the bank will return it, so we have to issue multiple cheques for a single invoice. Since my invoice value is 7,00,000, I have to issue cheques of 5,00,000 and 2,00,000 through F110.
    Kindly suggest how I can achieve this.
    rgds
    gopal

  • Limiting decimal places

    is there a way to limit the number of decimal places that appear in an output?

    Yes - truncate it manually or with some input-method limiter.
    If you are reading a spreadsheet file and writing the values to the console, you can simply check the value of the string and determine how many digits come after the "." character. That gives you an index into the string.
    Using that index you can then simply create a new string containing the whole number, the decimal point, and only those places that you want.
    You can also do the reverse and expand the decimals, say from 1 decimal place (like 10.2) to 2 decimal places (like 10.20), by simply appending the needed places to the string after the last character.
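A small sketch of that string-based approach, in Java since the thread doesn't name a language (the helper name toPlaces is made up; in practice String.format("%.2f", x) or BigDecimal.setScale does this in one call):

```java
// Truncate or pad the decimal part of a numeric string to a fixed
// number of places, using plain string manipulation as described above.
public class DecimalPlaces {

    static String toPlaces(String number, int places) {
        int dot = number.indexOf('.');
        if (dot < 0) {                       // no decimal point yet: add one
            number = number + ".";
            dot = number.length() - 1;
        }
        String decimals = number.substring(dot + 1);
        if (decimals.length() > places) {    // too many digits: truncate
            return number.substring(0, dot + 1 + places);
        }
        StringBuilder sb = new StringBuilder(number);
        for (int i = decimals.length(); i < places; i++) {
            sb.append('0');                  // too few digits: pad with zeros
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(toPlaces("10.2", 2));    // 10.20
        System.out.println(toPlaces("3.14159", 2)); // 3.14
    }
}
```

Note this only changes the displayed string; the underlying double keeps its full precision.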

  • Brand new mac pro crashing. Software or hardware problem?

    Hi!
    First of all, I just wanna say - help from as many as possible would be highly appreciated, as I've talked to apple service center etc. No one knows what the problem is.
    Alright, here we go. I purchased a Mac Pro with 2x500GB HD, 4GB Ram and 2x2,66 Dual Core Intel Xeon processors just two weeks ago. Up until now, it has worked flawlessly.
    The first thing I did, was I imported my home folder from the .mac software Backup, and it worked fine. Everything was okay for a week or so.
    Then I used the application Visual Hub to convert some video files into iTunes, and it worked fine. Along with this, I downloaded stuff with Tomato Torrent - four or five downloads, couple of gigs, but not too much.
    And then the mystery started. I had the following applications open; Pages, Mail, Safari, iTunes, Visual Hub, Tomato Torrent and iCal. All of a sudden, iTunes crashed, saying it couldn't save the session or something, and my music library xml file got damaged. This happened quite a few times, and I had to re-import all the songs and movies and make new playlists.
    Pages said it couldn't save to disk, and Mail said the startup disk was full (it had over 350 GB of free space).
    I forced a shutdown, and when I restarted the machine the blinking folder with a question mark showed up. I restarted again, it found the startup disk, and I got into Mac OS X. However, no applications would open, and if I tried to copy anything or empty the trash, the dialog box showed the files as DW and NE instead of their file names.
    So I restarted from the install disc. It couldn't activate the main HD when I tried Disk Tools, and found lots of faults on the main HD that couldn't be repaired. After 5 or 6 restarts it finally found it, and I was able to re-install Mac OsX.
    After that, it seemed okay. Everything worked as it should, but the machine is working very slow, and when opening files and folders I sometimes get the wheel of death. It hangs in almost every application and when I try to move around a finder window for example, it jumps around in sharp movements instead of gliding.
    Disk tools can't find any problems on the harddrives anymore. Everything seems okay, but I feel that it's not.
    Apple says it might be the harddrive or the memory.
    I really do not want to send them the machine, cause I really depend on it. My question is - could this be a hardware or software problem?
    If it is a software problem, does anybody know what it can be?

    I agree that it might be the hard drive or memory.
    Run Apple Hardware Test.
    Get some backup drives going, and use Disk Warrior and SuperDuper.
    Did you import any applications from a G4/G5 computer? Don't; install them fresh.
    Were any of the drives or memory 3rd party, or all Apple?
    When you installed the system again, did you erase first? You should, especially because it came up with errors and problems. You NEED 3rd-party disk repair (Disk Warrior is the most reliable and preferred).
    It sounds like Visual Hub or Tomato threw a monkey wrench into the gears.
    And at this point your drive and file system needs to start all over, fresh, from scratch, and backup your data files, and then reformat.
    The only way to know is to physically test and do the things that a "mechanic" does, with more tools and experience with tools and problems.
    iTunes can push the limits and stress RAM, so it is a good way to find marginal or poor RAM. So is Memtest. It takes a while, but you can run 16 copies at once to test 256MB chunks for 4-5 cycles. No guarantee.
    Zero your drives. It helps ensure they are working. Or put them in a PC, or run XP/Vista on your Mac Pro and use vendor tools (LifeGuard, SeaTools, etc.) to map out blocks or zero the drive.
    It also sounds like the drive's permissions were trashed in the process and the ownership locked you out. Or something. WhatSize and OmniDiskSweeper are good. Or Tomato turned something on/off, or into soup or sauce.
    Never continue to use a system after freezes and crashes without taking some precautions, maintenance, booting from an emergency 2nd drive.
    Use FSCK in single user mode.
    Do a Safe Boot.
    run Applejack to delete all the cache files along with any swap files (can't hurt, some don't see the need).
    Make a list, Rule something out and move on. But you will have to start with a freshly erased drive or partition, fresh OS, and then gradually test and add, one thing at a time.
    Lastly, when you do install the OS, and have made all the APPLE updates, make a backup.
    Create a (sparse) disk image as well on another drive or drives. Set one up as "emergency only" and never add to it, alter it, or use it for anything other than running Apple updates, disk repair stuff, and testing the main drive.
    If you get to having a stable system, back it up as a clone of your working system. Put it away (Firewire is ideal).
    Updating to 10.4.11 or something? Back up first so you have a working copy. Only add or update programs AFTER you have a backup. It takes longer, but you can 'fall back' and you have a recovery plan.
    Installing an application that you are unsure of or that might cause trouble? Even an OS update can cause trouble, and there is no easy way yet on OS X to "back out", uninstall, or revert to an earlier point or driver or version. The only way I know of is with backups and more backups.
    Which is why I have JUST the OS on its own drive, just have to deal with applications and the system that way, and all the data and personal files are safe but also backed up.
    Rather than use online backup, get some backup disk drives.
    That may be, or should be, the "common sense" approach to installing and updating. It takes longer, but it protects you and your data.
    When you had to re-import your data again and again, you were already walking on thin ice, about to go. Or had gone and just hadn't hit yet.

  • How to have the library updated between two computers?

    I have two computers. One runs Windows XP (always on) with iTunes and the whole library in it. I share this iTunes library wirelessly so that my MacBook Pro can access it. The problem is that whenever I update the library from one of these computers, the change doesn't show up when accessing it from the other. Is there a script or something to help me with this problem?
    Thanks.

    if you want to try this:
    http://www.macosxhints.com/article.php?story=20070424081346722
    Almost perfect, except that this method limits me to having one iTunes running at a time. My Windows XP iTunes is always on, always running, serving as a streaming source for my Apple TV. My MacBook Pro shares its iTunes library and uses an AirPort Express in another room. I almost never do anything on my Windows PC, so I'd like to manage all the stuff from the MBP.

  • Virtual cube reporting logic

    I have created a report directly on the BCS transactional totals cube, and it is giving correct output.
    But why does SAP suggest building BCS reports on the virtual cube rather than the BCS totals cube? Is it more helpful for reporting if COI logic is implemented?
    What is the distinct difference/reason for creating reports with the virtual cube vs. the transactional cube?
    What are the pros and cons of creating reports out of the BCS transactional cube vs. the virtual cube?
    Inputs are appreciated.

    Hi Christopher,
    In general, you are right.
    But, in practice... the restrictions that you mentioned can be very easily avoided. As I wrote here:
    COI Equity method...
    it's not even necessary to have associated companies in the group's hierarchy. Hence, the at-equity method limitation doesn't apply in our situation.
    The main point here is the general consolidation logic. And the virtual cube performs it perfectly.
    Of course, we can devise situations that the vcube will not be able to handle, like running a report for a period when a daughter company had already been sold or had become an associate (drastic org changes).

  • Bug: no constraints/foreign keys when Scripter.Prefetch = true

    Kind of a bug report, using assemblies version v11.0.0.0 and ScriptingOptions.DriAll = true.
    When you have tables with foreign keys or other constraints, these are not scripted when you pass the urns of all tables to Scripter.Script with the Prefetch property set to true. I believe this is due to a bug in prefetching.
    To work around it, set Scripter.Prefetch = false. Strangely enough, passing the urns of a subset of the tables works as well (even excluding only one table), which matches the reported behavior that "constraints are returned when passing a single urn only".

    Hi, sure, sorry for late reply.
    This is the DB create script:
    DROP DATABASE [TestDB]
    USE [master]
    GO
    CREATE DATABASE [TestDB] CONTAINMENT = NONE ON PRIMARY
    ( NAME = N'TestDB', FILENAME = N'C:\Windows\Temp\Test.mdf' , SIZE = 5120KB , MAXSIZE = UNLIMITED, FILEGROWTH = 1024KB ) LOG ON
    ( NAME = N'TestLG', FILENAME = N'C:\Windows\Temp\Test.ldf' , SIZE = 1024KB , MAXSIZE = 2048GB , FILEGROWTH = 10%)
    GO
    ALTER DATABASE [TestDB] SET COMPATIBILITY_LEVEL = 110
    GO
    ALTER DATABASE [TestDB] SET ANSI_NULL_DEFAULT OFF
    ALTER DATABASE [TestDB] SET ANSI_NULLS OFF
    ALTER DATABASE [TestDB] SET ANSI_PADDING OFF
    ALTER DATABASE [TestDB] SET ANSI_WARNINGS OFF
    ALTER DATABASE [TestDB] SET ARITHABORT OFF
    ALTER DATABASE [TestDB] SET AUTO_CLOSE OFF
    ALTER DATABASE [TestDB] SET AUTO_CREATE_STATISTICS ON
    ALTER DATABASE [TestDB] SET AUTO_SHRINK OFF
    ALTER DATABASE [TestDB] SET AUTO_UPDATE_STATISTICS ON
    ALTER DATABASE [TestDB] SET CURSOR_CLOSE_ON_COMMIT OFF
    ALTER DATABASE [TestDB] SET CURSOR_DEFAULT GLOBAL
    ALTER DATABASE [TestDB] SET CONCAT_NULL_YIELDS_NULL OFF
    ALTER DATABASE [TestDB] SET NUMERIC_ROUNDABORT OFF
    ALTER DATABASE [TestDB] SET QUOTED_IDENTIFIER OFF
    ALTER DATABASE [TestDB] SET RECURSIVE_TRIGGERS OFF
    ALTER DATABASE [TestDB] SET DISABLE_BROKER
    ALTER DATABASE [TestDB] SET AUTO_UPDATE_STATISTICS_ASYNC OFF
    ALTER DATABASE [TestDB] SET DATE_CORRELATION_OPTIMIZATION OFF
    ALTER DATABASE [TestDB] SET TRUSTWORTHY OFF
    ALTER DATABASE [TestDB] SET ALLOW_SNAPSHOT_ISOLATION OFF
    ALTER DATABASE [TestDB] SET PARAMETERIZATION SIMPLE
    ALTER DATABASE [TestDB] SET READ_COMMITTED_SNAPSHOT OFF
    ALTER DATABASE [TestDB] SET HONOR_BROKER_PRIORITY OFF
    ALTER DATABASE [TestDB] SET RECOVERY SIMPLE
    ALTER DATABASE [TestDB] SET MULTI_USER
    ALTER DATABASE [TestDB] SET PAGE_VERIFY CHECKSUM
    ALTER DATABASE [TestDB] SET DB_CHAINING OFF
    ALTER DATABASE [TestDB] SET FILESTREAM( NON_TRANSACTED_ACCESS = OFF )
    ALTER DATABASE [TestDB] SET TARGET_RECOVERY_TIME = 0 SECONDS
    ALTER DATABASE [TestDB] SET READ_WRITE
    GO
    USE [TestDB]
    GO
    CREATE TABLE [dbo].[KeySource] (
    [ID] [smallint] IDENTITY(1,1) NOT NULL,
    CONSTRAINT [PK_Reservation] PRIMARY KEY CLUSTERED ([ID] ASC)
    WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ) ON [PRIMARY]
    CREATE TABLE [dbo].[KeyUse] (
    [UseID] [smallint] NOT NULL,
    [Row] [tinyint] NOT NULL,
    [Column] [tinyint] NOT NULL,
    CONSTRAINT [PK_KeySource] PRIMARY KEY CLUSTERED ([UseID] ASC, [Row] ASC, [Column] ASC)
    WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY],
    CONSTRAINT [IX_ReservationSeats] UNIQUE NONCLUSTERED ([Row] ASC, [Column] ASC)
    WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ) ON [PRIMARY]
    GO
    ALTER TABLE [dbo].[KeyUse] WITH CHECK ADD CONSTRAINT [FK_Source_Use] FOREIGN KEY([UseID]) REFERENCES [dbo].[KeySource] ([ID])
    ON UPDATE CASCADE
    ON DELETE CASCADE
    ALTER TABLE [dbo].[KeyUse] CHECK CONSTRAINT [FK_Source_Use]
    ALTER TABLE [dbo].[KeyUse] WITH CHECK ADD CONSTRAINT [FK_Use_Use] FOREIGN KEY([UseID], [Row], [Column]) REFERENCES [dbo].[KeyUse] ([UseID], [Row], [Column])
    ALTER TABLE [dbo].[KeyUse] CHECK CONSTRAINT [FK_Use_Use]
    This is the code:
    ServerConnection serverConnection = new ServerConnection("server", "user", "pwd");
    Server server = new Server(serverConnection);
    Database database = server.Databases["TestDB"];
    Urn[] urns = database.EnumObjects(DatabaseObjectTypes.Table).Rows.OfType<DataRow>().Select(r => new Urn((string)r["Urn"])).Take(2).ToArray();
    Scripter s = new Scripter(server);
    s.Options.DriAll = true;
    StringCollection result = s.Script(urns);
    This is the result:
    SET ANSI_NULLS ON
    SET QUOTED_IDENTIFIER ON
    CREATE TABLE [dbo].[KeySource](
    [ID] [smallint] IDENTITY(1,1) NOT NULL
    ) ON [PRIMARY]
    SET ANSI_NULLS ON
    SET QUOTED_IDENTIFIER ON
    CREATE TABLE [dbo].[KeyUse](
    [UseID] [smallint] NOT NULL,
    [Row] [tinyint] NOT NULL,
    [Column] [tinyint] NOT NULL
    ) ON [PRIMARY]
    Note that the constraints are missing.
    As already noted, adding s.PrefetchObjects = false; line fixes this.
    But let's try the magic way - supplying fewer Urns than there actually are tables. To test this, add a dummy table to the DB:
    USE [TestDB]
    CREATE TABLE [dbo].[DummyTable]([Test] [nchar](10) NULL) ON [PRIMARY]
    Run the code again. Note that the Take(2) method limits the Urns supplied to the Scripter to the same set as before.
    However, the result is now:
    SET ANSI_NULLS ON
    SET QUOTED_IDENTIFIER ON
    CREATE TABLE [dbo].[KeySource](
    [ID] [smallint] IDENTITY(1,1) NOT NULL,
    CONSTRAINT [PK_Reservation] PRIMARY KEY CLUSTERED
    (
    [ID] ASC
    )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ) ON [PRIMARY]
    SET ANSI_NULLS ON
    SET QUOTED_IDENTIFIER ON
    CREATE TABLE [dbo].[KeyUse](
    [UseID] [smallint] NOT NULL,
    [Row] [tinyint] NOT NULL,
    [Column] [tinyint] NOT NULL,
    CONSTRAINT [PK_KeySource] PRIMARY KEY CLUSTERED
    (
    [UseID] ASC,
    [Row] ASC,
    [Column] ASC
    )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY],
    CONSTRAINT [IX_ReservationSeats] UNIQUE NONCLUSTERED
    (
    [Row] ASC,
    [Column] ASC
    )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ) ON [PRIMARY]
    ALTER TABLE [dbo].[KeyUse] WITH CHECK ADD CONSTRAINT [FK_Source_Use] FOREIGN KEY([UseID])
    REFERENCES [dbo].[KeySource] ([ID])
    ON UPDATE CASCADE
    ON DELETE CASCADE
    ALTER TABLE [dbo].[KeyUse] CHECK CONSTRAINT [FK_Source_Use]
    ALTER TABLE [dbo].[KeyUse] WITH CHECK ADD CONSTRAINT [FK_Use_Use] FOREIGN KEY([UseID], [Row], [Column])
    REFERENCES [dbo].[KeyUse] ([UseID], [Row], [Column])
    ALTER TABLE [dbo].[KeyUse] CHECK CONSTRAINT [FK_Use_Use]
    By testing on a larger database I have noticed that it does not really matter which table is excluded from the list.
    Hope this helps,
    Jan

  • ATP/GATP documents

    Hello all,
    Do you guys have documents that show the whole process for GATP and ATP? We'll start an APO implementation project in a few months… and until then I'd like to clarify these differences…
    Thanks in advance

    Hello Bruno,
    I'm not sure if such documentation exists, but I can describe the differences briefly:
    ATP in R/3:
    - ATP check only inside one R/3 system
    - DB tables or shared buffer (ATP server)
    - standard check methods
    - limitations exist on combining the check methods
    - simple substitution of products
    - simple backorder processing
    Global ATP:
    - cross-system ATP check possible
    - uses liveCache technology
    - extended check methods (e.g. triggering PP/DS to confirm an order)
    - no limitations on combining the check methods
    - rule-based substitution of products, locations, PPMs
    - extended backorder processing
    As you can see, GATP has a lot more options to check and confirm a requirement. For further details about any single point of GATP, please check SAP Help.
    Link: [http://help.sap.com/saphelp_scm50/helpdata/en/26/c2d63b18bc7e7fe10000000a114084/frameset.htm]
    Hope this helps.
    best regards,
    Michael

  • Concurrent readsocketconnections

    I am having a similar problem to the one described by Bill Kline in a December '99 newsletter (weblogic.developer.interest.performance), relating to the number of concurrent connections within WebLogic Server.
    I am using a Compuware tool (QALoad) to drive an 8x Windows 2000 Advanced Server platform with WebLogic 5.1 and JVM 1.2.2. No matter how many virtual users I spawn (1-100) from the load test tool, I can only get 16 threads into my application server daemon and thus am prevented from loading down the machine (about 28% busy).
    WebLogic documentation indicates the default number of readsocketconnections is twice the number of CPUs (2 * #cpus), and I suspect this parameter may be the bottleneck limiting my stress on the 8x platform. I've been reading docs and searching the NT registry for this "connection" parameter but cannot find it anywhere. I don't even know the name of the parameter to search for. Can you advise or help in any way? I would appreciate it very much.
    If you cannot help, can you point me to the right person for assistance?
    Dave Hiller
    Global Industry - Transportation
    Unisys
    [email protected]
    W651-687-2805
    F651-687-2368

    Dave,
    Are you running on an 8-way Unisys ES5085? If you are, you could also consider keeping executeThreadCount at 15 and starting up additional WLS instances. Your app may run into more serialization inside the JVM at a high executeThreadCount. Of course, it depends on your app. Benchmarks will help you determine your sweet spot.
    "Dave Hiller" <[email protected]> wrote in message
    news:3be1a86b$[email protected]..
    Iam having a similar problem as described by Bill Kline of a December 99newsletter, (weblogic.developer.interest.performance) relating to the number
    of concurrent connections within Web Logic Server.
    >
    I am using a compuware tool (QALoad) to drive an 8x Windows 2000 AdvancedServer platform with Web Login 5.1 and JVM 1.2.2. No matter how many
    virtual users I spawn (1-100) from the load test tool, I can only get 16
    threads into my application server daemon and thus am inhibited from loading
    down the machine (about 28 % busy).
    >
    WebLogic documentation indicates the default for the number ofreadsocketconnections is twice the number of cpu's (2*#numcpu's), and I am
    suspecting this parameter maybe the bottleneck in the path limiting my
    stress on the 8x platform. I've been reading docs and searching the NT
    registery for this "connection" parameter but cannot find it anywhere. I
    don't even know the name of the parameter to search for it. Can you advise
    or help in anyway? I would appreciated it very much.
    >
    If you cannot help can you point me to the right person for assistance?
    Dave Hiller
    Global Industry - Transportation
    Unisys
    [email protected]
    W651-687-2805
    F651-687-2368

  • Keep last frame of a clip until the next clip in the timeline?

    Is there an easy method to keep displaying the last frame of a clip until the next clip shows up in the timeline? I do not want black video in between.
    Video demo:
    http://www.bartelsmedia.com/vid/tmp_premiere_question2.swf
    I know that one can mess around with the playback speed of a clip but this makes it hard to exactly hit the beginning of the next clip.
    Any ideas?
    Ciao, Gunnar

    Yeah, Hunt. Frame hold certainly has room for improvement. It also suffers from the auto-deinterlacing bug wherein the held frame is (often?) deinterlaced whether you've ticked "deinterlace" or not, so it is totally unusable for me.
    The workaround of exporting .BMPs is wretched also, as this forces a YUV->RGB conversion (for DV, HDV, etc.) that clamps the color range.
    I don't like the rate changing method either, as there is a risk of inaccuracy -- especially if you need to tweak the ins/outs afterwards.
    When I know I need to do lots of frame holds (such as with instructional animation, screen captures, etc.) I make sure I build in at least an extra second of no-mo where I plan to hold the frame. Then I copy/paste 1 second clips to fill the time. This sucks also, but it is the only way I have found around the other methods' limitations.
    My PPro "frame hold fantasy vision" (based on my hazy recollection of my old NLE ca. 1993) is something like: Right-click, choose "Frame Hold at CTI" -- which would automatically set Marker 0 at the CTI, frame hold at Marker 0, and then allow you to infinitely pull the ends of the clip. The overextended regions should be indicated with grey diagonals on the clip in the timeline, just like when transitions run out of frames or nested sequences get shortened, etc.

  • GL approval button is disabled

    Hello gurus,
    I am trying to approve a GL journal, but the approval button is disabled. The two profiles - Journals: Allow Preparer Approval and Journals: Find Approver Method - plus limits and supervisor are all set up accordingly.
    Please let me know what I have missed to enable the button.
    Thanks

    Please ignore this query . I have now resolved the issue.
    Regards

  • SQL Server Express Performance Limitations With OGC Methods on Geometry Instances

    I will front load my question.  Specifically, I am wondering if any of the feature restrictions with SQL Server Express cause performance limitations/reductions with OGC methods on geometry instances, e.g., STIntersects?  I have spent time reading
    various documents about the different editions of SQL Server, including the Features Supported by the Editions of SQL Server 2014, but nothing is jumping out at me.  The
    limited information on spatial features in the aforementioned document implies spatial is the same across all editions.  I am hoping this is wrong.
    The situation... I have roughly 200,000 tax parcels within 175 taxing districts. As part of a consistency check between the taxing district stored in the tax records and the one identified spatially, I set up a basic point-in-polygon query to identify each taxing district spatially and then count the number of parcels within it. Surprisingly, the query took 66 minutes to run. As I pointed out, this is being run on a test machine with SQL Server Express.
    Some specifics....  I wrote the query a few different ways and compared the execution plans, and the optimizer always choose the same plan, which is good I guess since it means it is doing its job.  The execution plans show a 'Clustered Index Seek
    (Spatial)' being used and only costing 1%.  Coming in at 75% cost is a Filter, which appears to be connected to the STIntersects predicate.  I brute forced alternate execution plans using HINTS, but they only turned out worse, which I guess is also
    good since it means the optimizer did choose a good plan.  I experimented some with changing the spatial index parameters, but the impact of the options I tried was never that much.  I ended up going with "Geometry Auto Grid" with 16 cells
    per object.
    So, why do I think 66 minutes is excessive?  The reason is that I loaded the same data sets into PostgreSQL/PostGIS, used a default spatial index, and the same query ran in 5 minutes.  Same machine, same data, SQL Server Express is 13x slower than
    PostgreSQL.  That is why I think 66 minutes is excessive.
    Our organization is mostly an Oracle and SQL Server shop.  Since more of my background and experience are with MS databases, I prefer to work with SQL Server.  I really do want to understand what is happening here.  Is there something I can
    do different to get more performance out of SQL Server?  Does spatial run slower on Express versus Standard or Enterprise?  Given I did so little tuning in PostgreSQL, I still can't understand the results I am seeing.
    I may or may not be able to strip the data down enough to be able to send it to someone.

    Tessellating the polygons (tax districts) is the answer!
    Since my use of SQL Server Express was brought up as possibly contributing to the slow runtime, the first thing I did was download an evaluation version of Enterprise Edition.  The runtime on Enterprise Edition dropped from 66 minutes to 57.5 minutes.
     A reduction of 13% isn't anything to scoff at, but total runtime was still 11x longer than in PostgreSQL.  Although Enterprise Edition had 4 cores available to it, it never really spun up more than 1 when executing the query, so it doesn't seem
    to have been parallelizing the query much, if at all.
    You asked about polygon complexity. Overall, a majority are fairly simple, but there are some complex ones, with one really complex polygon. Using the complexity index discussed in the referenced thread, the tax districts had an average complexity of 4.6 and a median of 2.7. One polygon had a complexity index of 120, which was skewing the average, as well as increasing the runtime, I suspect. Below is the complexity index breakdown:
    Index   NUM_TAX_DIST
    1       6
    <2      49
    <3      44
    <4      23
    <5      11
    <6      9
    <7      9
    <8      4
    <9      1
    <10     4
    >=10    14
    Before trying tessellation, I tweaked the spatial indexes in several different ways, but the runtimes never changed by more than a minute or two. I reset the spatial indexes to "geometry auto grid @ 32" and tried out your tessellation functions using the default of 5000 vertices. Total runtime: 2.3 minutes, a 96% reduction and twice as fast as PostgreSQL! Now that is more like what I was expecting before I started.
    I tried different thresholds, 3,000 and 10,000 vertices, but the runtimes were slightly slower, 3.5 and 3.3 minutes respectively. A threshold of 5,000 definitely seems to be a sweet spot for the dataset I am using. As the thread you referenced discussed, SQL Server spatial functions like STIntersects appear to be sensitive to the number of vertices of polygons.
    After reading your comment, I was reminded of some discussions with Esri staff about ArcGIS doing the same thing in certain circumstances, but I hadn't gone as far as thinking to apply it here. So, thanks for the suggestion and the code from the other post.
    Once I realized the SRID was hard-coded to 0 in tvf_QuarterPolygon, I was able to update the code to set it to the SRID of the input shape, and then everything came together nicely.

  • Newbie: Method should or should not have side effects

    Hi experts,
    What does it really mean when I read, for the InputVerifier class, that the method shouldYieldFocus can have side effects but the method verify should not have side effects?
    Thanks for you comments.
    tuckie

    I am but a newbie, only asked to learn and maintain. The reason I ask about side effects is that the shouldYieldFocus() method is invoked twice for the same Tab key event. When the Tab key (or a mouse click) wants to move focus to another input, the current input's shouldYieldFocus() is invoked; it in turn invokes verify(), which validates the data, returning true or false, and checks whether a warning should be issued that the data is legal but high. If the data is not high it returns true, the focus is yielded, and shouldYieldFocus() is invoked only once. It is when the data is high and the showConfirmDialog() is put up that I get the second shouldYieldFocus() invocation. The previous coder put in a de-bouncing mechanism, and I think this is where/how the problems with the next field are created. Sometimes the next field's focusLost() is invoked without the operator making any input. The focusLost() does some fill-in-the-blank things that are reasonable only if the operator really wanted not to fill in any data in the field.
    Back to my original point, I was wondering if the fact that the verify() method may have a dialog box put up before it returns to shouldYieldFocus() is the kind of thing that shouldn't be done - no side effects. If so then it could be the likely cause of the problem with the next field sometimes being automatically filled in as if it had received a focusLost() event.
    tuckie
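A minimal sketch of the split the Javadoc is asking for (the threshold field and class name are made up for illustration): verify() stays a pure check with no dialogs or state changes, and any dialog, being a side effect, lives only in shouldYieldFocus():

```java
import javax.swing.InputVerifier;
import javax.swing.JComponent;
import javax.swing.JOptionPane;
import javax.swing.JTextField;

// Sketch: verify() only inspects the component and returns a verdict;
// shouldYieldFocus() performs the side effects (the warning dialog).
public class RangeVerifier extends InputVerifier {
    private final double max;

    public RangeVerifier(double max) {
        this.max = max;
    }

    @Override
    public boolean verify(JComponent input) {
        // Pure check: parse the text; no dialogs, no state changes.
        try {
            Double.parseDouble(((JTextField) input).getText());
            return true;
        } catch (NumberFormatException e) {
            return false;
        }
    }

    @Override
    public boolean shouldYieldFocus(JComponent input) {
        if (!verify(input)) {
            return false;                      // invalid: keep focus here
        }
        double value = Double.parseDouble(((JTextField) input).getText());
        if (value > max) {
            // Side effects are allowed here: warn, but still yield focus.
            JOptionPane.showMessageDialog(input,
                "Value is legal but unusually high.");
        }
        return true;
    }
}
```

With this split, the focus manager can call verify() as often as it likes without popping a dialog, which is exactly the double-invocation symptom described above.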
