Focus On The Future

Extracted from DPReview - http://www.dpreview.com/news/0511/05112206refocuscamera.asp
Ren Ng, a graduate student at Stanford University, has developed a hand-held plenoptic camera which takes the shot first and lets you decide on the focus point in software after the event. The prototype camera is actually a Contax 645 with a modified Megavision FB4040 back (sixteen megapixels). The back has had an array of 90,000 microlenses mounted in front of the sensor (with a gap between the array and the sensor). These microlenses create a unique image on the sensor surface which records not only how much light arrives at each location, but also the direction it arrives from - that is, how much light travels along each ray. The image is then reconstructed in software and a focus point can be chosen. Note that the final resolution is limited to the number of microlenses.
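
Each angular sample in such a light field is a slightly shifted view of the scene, so summing the views with shifts proportional to their angular offset synthesizes a new focal plane. Below is a minimal, hypothetical shift-and-add sketch in Python/NumPy - not Ng's actual reconstruction code - where the 4D array layout, the refocus() helper and the alpha depth parameter are illustrative assumptions.

```python
import numpy as np

def refocus(light_field, alpha):
    """Shift-and-add synthetic refocusing (illustrative sketch).

    light_field: 4D array indexed [u, v, s, t] - angular samples (u, v)
    and spatial samples (s, t), i.e. one small image per view direction.
    alpha: relative focal depth (1.0 keeps the captured focus plane).
    """
    n_u, n_v, n_s, n_t = light_field.shape
    out = np.zeros((n_s, n_t), dtype=np.float64)
    for u in range(n_u):
        for v in range(n_v):
            # Shift each view in proportion to its offset from the optical
            # axis; nearest-pixel shifts keep the sketch dependency-free.
            du = int(round((u - n_u / 2) * (1.0 - 1.0 / alpha)))
            dv = int(round((v - n_v / 2) * (1.0 - 1.0 / alpha)))
            out += np.roll(light_field[u, v], shift=(du, dv), axis=(0, 1))
    return out / (n_u * n_v)

# Example: an 8x8 angular grid of 300x300 views - roughly the spatial
# resolution a 90,000-microlens array would give.
lf = np.random.rand(8, 8, 300, 300)
near = refocus(lf, alpha=0.8)
far = refocus(lf, alpha=1.2)
```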

I am fed up with these stupid, non-constructive battles between these giant corporations. They could fix the issue in no time, but they would rather make us struggle - us, who pay for their products - than make their platforms interoperable.
This is sick.
Yvan

Similar Messages

  • The future of Novell (Micro Focus Question/Concern)

    I've been a loyal and happy Novell engineer for many many years (like most here I'd imagine).
    While there have been many changes and retreats over the last several years, at least we always had the core to fall back on.
    My concern, with the latest press release from Micro Focus, is that Novell (as a brand) is going away.
    Is it just me, or does the latest press release seem to say everything is going to be rebranded as Micro Focus?
    "The product portfolios of Micro Focus, Borland, NetIQ, Attachmate and Novell will be brought together under the one portfolio of Micro Focus. Each product will have a defined strategy created and managed by a global product group."
    So that means the Novell brand is going away?
    On one hand, they bought it = they can do anything they want. For the price I'm sure they want to build their own brand.
    On the other hand, I've been a loyal Novell engineer for decades..... I have no interest in being a Micro Focus engineer. I don't know who they are, I don't know what they're about, I don't know how steadfast they are with the product, if they will increase engineering or decrease it. I don't know if they will milk what's left of Novell products until they cease to exist.
    The Attachmate ownership was a disaster (IMO); the only thing that saved Novell was the outstanding dev teams that really put a lot of work (and heart) into OES and GW.
    I have just started a new software development business and was happily looking forward to making it a proper Novell shop. Now I have to hesitate, because if the "Big Red N" is going away....... I'll slip away with it and have to make the nasty choice of being a Red Hat shop (which I have avoided up until now).

    It sounds to me like things are going to be rebranded. Whether that
    is a good thing or not, only time will tell. While I like the Novell
    brand, some people don't, so maybe a branding change will breathe some
    new life into marketing. But at this point, I'm not concerned because
    the people I deal with at Novell are still there and taking care of me
    and the roadmaps look good. I'm going to keep plugging away and see
    how it all pans out.
    Ken

  • Rather than persist in the future thread...

    The point I was making was for Arch's documents to pick up where man pages leave off, but whatever.
    If you think I stated Arch must fix man pages, not so. I was intending to show that docs as a whole leave whole subjects unhandled; Arch could provide the answers in its own docs and come out as a hero.
    Or maybe not.
    And remember, users who cannot configure ppp cannot get on the internet to get the ppp configuration docs. Or any other docs, for that matter. One failed configuration needing web documents might leave the newbie stranded.
    Or maybe not.
    I personally think that a lot of configuration has to be done before Linux (Arch or others) will communicate over the internet to a remote website, so as to make documentation visible..... but hey, maybe I missed something? Like the document to get a modem set up? So I could get web-based documents? Chicken and egg?
    Maybe not, eh?
    As for the concept of needing to be an experienced admin - these folks usually do not need any docs. That compares to bad docs for a newbie - essentially no docs. Missing docs and no internet access are also useless. Thus, the newbie without documents needs the only thing which gets anyone else through the issue - the newbie needs experience, because the docs are weak or missing or erroneous. Try as you might, that is an issue I went through. Take, for example, when I needed network assistance. Arch's forums were not the first place I went asking for help, believe me; I asked and am still asking. 2 years now, and I'm just getting told Samba. Need I say this is baloney? That issue is not Arch's fault, but Arch could have a smashing success on its hands if it were to document a few critical areas as well as 0.2 was documented.
    Installing and partitioning... What can I say... apeiro is not mentioning anything I've PM'd him about: secondary hard disks getting reformatted. My pm brought me the answer that most Archers never repartition, so the script was.... well, I guess you could say it was a shot in the dark. Fine, but the PM stated the alternative was something like 'I usually use the commandline stuff and simply reformat the existing partitions'.
    Why the **** would an inexperienced newbie set up partitions outside of using the installer?? The Arch docs tell the newbie to use the installer to partition their hard disks! What, exactly, is the newbie's idea of an installer's purpose? Are you a newbie when you decide what a newbie has for skills? The installer obliterated gigabytes of data; I trusted that it was going to perform safely, and yet nobody ever used it to learn of the glitch? Ooookkaaaaayyyy.
    Folks, the docs said to use the installer. Never mentioned possibly partitioning from the command line. Take your pick, the newbie was not at fault for following the docs.
    I'll run along, because I cannot possibly be speaking truth. Neither am I telling you where I had problems. This cannot be feedback, it must be dunbar's rantings. Does Arch have a future when feedback is considered trolling or flaming the developers? Lighten up, folks; I was once trying to offer help and seeking to get help.
    I know this is a labor of love for you all; in many ways, Linux is a labor of mine as well. Sadly, you are not hearing some issue that could help make Arch the distro that some folks want it to be.

    dunbar wrote:Step by step: I mentioned a few weak areas that newbies would not know how to address.
    I mentioned that newbie would not necessarily have the skills to get the correct docs from a myriad of websites.
    Which appear (and frequently are) obsolete.
    Well, then we include the docs, and then non-newbies would complain that we are cluttering the filesystem and you would complain because those documents are obsolete. sound familiar? i have seen this argument before and there is no winning it. people bitch about the manpages, docs, etc ALL THE TIME. you must know that. so the best we can do is add your concerns to our own documents and add necessary documents to the files.
    When their internet connection was what had failed, the internet doc is not available.
    And when they had no Xwindows in which to read HTML documents,
    and nothing tells the newbie about CLI browsers.
    i would expect that no absolutely green newbies would try arch, so those newbies that do show up, i would assume, have some linux experience (one can be a newbie and have some experience), so i would assume that they would know about the lynx or links browsers or at least know how to use cat and less.
    another point about this is that if a newbie does not even know how to use cat, less, or a cli editor to view files then they would be very very f**ked with arch linux and having self-contained documents would be pretty pointless. so again i would ask, is it not wise to print out/write down any relevant install/configuration instructions? you do know how to write, since you already say you don't have a printer?
    okay so if we include install configuration data in the installer will you know how to access it? is it fair to tell the user that it is in such and such directory and you can access it through say vc1-4? should we scrap our online docs altogether and just have them self contained?
    The network thing was changed along the way - IPX/SPX was part of the issue, my 5 port switch is another part. I am not berating you, sarah31, nor am I asking anyone to re-write man pages, etc. as several posts seemed to attribute to me. The man pages are the weak spot, Arch docs need to go from there. Never wanted to say more than that.
    and all i wanted to know is what apps do we extend our documentation to? all of them? are you saying that all our apps need extended docs? can you point me to one distro that has extended documents of all their applications? besides ppp what would YOU extend? it is fair to ask this of arch developers/users but they really need to know just what manpages are defunct and have no bearing on arch.
    i would not expect a request like this to get fulfilled too quickly; there are lots of applications' man pages and docs to go through, then the developers would have to determine what is needed and what is not.
    as for manpages......i used to find them unreliable but to be honest most manpages i have viewed lately have been very clear and concise about how to operate/configure apps. i am not saying this to contradict you but i say it honestly. when i have been unable to find someone to ask on this forum or irc, manpages normally do the trick for me. (and don't say it is because i am some expert with linux because i am definitely not)
    I'm trying to point out that A] with a presumed goal of gaining Linux users,
    B] Arch, being small, uncluttered and likely attractive to newbies (its small size draws newbies: it is dialup-accessible with only one big download at a time, and it fits older systems where XP won't fit on their disks, etc.) and also
    C] since Arch had very tight documents covering what needed to be done (but might need more topics covered), I felt that was likely going to lead to newbies arriving here in some situations that
    D] Arch clearly tries to assume will not need the attentions of forum members (which have the skills) and
    E] the newbie does not have the skills.
    That would lead to the assumption that Arch was not interested in the users' needs.
    with respect to this.....well what can i say but you are a complete ***hole. for one thing, small does not always attract newbies; in fact i know A LOT A LOT A LOT of people that will not even try arch or similar distros because they are small. most people want the choice to kludge up their system if they like, but arch is not in this realm yet because we do not offer a lot of packages.
    i agree we need more stuff covered in our documents and you would not find a single user or developer that would disagree with you. So i guess this point you make throws out your idea of extending the manpages, because that is not a concise project (for example, the manpages for transcode are good, but to get into all the basics a newbie would need to know would require one to be a lot more verbose; similarly with networking documents)?
    to say that we here do not pay attention to users is ABSOLUTELY THE WRONG ANSWER. there are only two pages of unanswered posts, and considering there are always posts that are merely statements that do not need to be answered, that is very good. in fact, if you even bothered to check, there are only 79 unanswered posts out of 4512. that means only 1.75% of the posts on the forum are unanswered. besides that, there are always questions that no one has an answer to. obscure problems do exist; i know i have hit many in my ventures in any OS.
    besides this, forums are a free service. no one is under obligation to use it, users AND developers alike. no one is paid here, so don't diss anyone and don't feel that anyone is obligated to answer you.
    i can also say that you are a complete arrogant ass for saying arch does not care about its users. we don't care about you, that is for certain. but i can guarantee that everything i did, every package i made, upgraded, donated, fixed, etc, was for the user. people wanted openoffice; i spent a week building it, then it got broken with the upgrade to gcc 3.3, i spent another week trying to fix it without luck, then judd spent 3 days compiling and patching 1.1. so don't you EVER say we don't fucking care, you little ingrate.
    The perceived 'lack of clarity' on my part is because I am still a newbie. I might have a few things working under Slackware, but I'm not certain, today, where I even made the changes. I'm not asking for hints on how to keep a notebook; I have one, it is 40 miles away, and I cannot discuss Linux by reading my notes or grepping my config files when they are 40 miles away. I cannot log into my Linux PC; I only have dial-up, and the only telephone line in my house is for voice communications. DSL is too far away, cable is too expensive.... have I said any of this before? No, not in this thread, but I've always been a hardliner on those points, sarah31. I do not match up with what is assumed of me. But here I am, posting, despite my ineptitude.
    "Waaagh i'm a newbie. waggh i don't have highspeed pity me pity meee!"
    there are lots of users on dialup here including developers so we don't give a flying f**k.
    I believe the deepest undercurrent I see here is diverging viewpoints of what Arch is and diverging goals for Arch. My views are different from those of a few who assume the Archer has the skills, hApy seems to have a third viewpoint, and yet another viewpoint exists beyond that.
    yeah and you are saying we have to conform to your view. typical. funny we seem to be getting more and more users all the time both newbie and non newbie and this despite having a completely uncaring development team and a horrendous set of docs. one of the funny things is that many of the newbies recently are all dialup like you and they still take time to make irc interesting or contribute packages.
    If my posts reflect an atmosphere of bewildering viewpoints, I'm not surprised - I've tried to reply to differing posts which take differing opinions; I suppose it is frustrating to anyone to reply to 3 posts at once. I'm replying in order to offer my view of when I was a frustrated newbie (this morning, I think ;-) ). Remember, I was told by a certain forum member that, most certainly, dunbar was a slackware user and the assumption was that he must be nearly expert - yet, I declared, no, I'm not an expert and I freely admit a lack of skills. I was not the one who estimated dunbar's skills, Sarah31.
    oh you are soooo subtle in your insults. come on you tell me after hundreds of installs and two years of using linux you don't know anything? find me five green newbies that know how to grep or know to look online for information. personally i and many others here and elsewhere have little time for someone who cries about being a newbie when they obviously aren't.
    what is it, some sort of ploy to make people feel sorry for you, or to shorten your search online for info that you likely could find in two minutes? spare me. up to one year you are a n00b; after that you are not.
    I'm definitely not interested in dissing anyone, not you, nor Gyroplast, nor Apeiro, nor hApy .....
    hmmm you care to stand by that or should i pull out several quotes to the contrary?
    I have pointed out that early on, Arch was interested in being Judd's perfect distro... did anyone ask him if he ever said that? As I said, I can offer to cut and paste, if you wish.
    so your point is? is this a bad thing? is it not possible for many of us to believe it is the perfect distro for us as well? why not diss yoper for claiming to be THE distro.
    Yet, I'm clearly getting 2 viewpoints in response to me repeating that fact. Thus, in order to push the reader away from their position, so as to get them to walk in my shoes, I post from different directions aiming toward one central condition. Is the issue me, and me alone? Or did the issue exist, and I'm guilty of responding to all viewpoints and thus I'm guilty of pointing it out?
    i have been in your shoes and i often get put back in your shoes when i start to use or investigate applications or areas of linux i have never explored before. so what's your point? if it was to fix up the documentation, that has been a goal for some time now; in fact a few users have made it an ongoing thing.
    you are not the only one that is a newbie here nor are you the only one here. there are many things that developers must balance when they do their duties. there will never be a distro that will satisfy all a user's needs but i can tell you that the arch team does try to please as many people as possible. so yeah i think you are the issue to some extent. you could have come in here and politely explained what it is that you felt needed improvement but instead you came in and whined and cried that you were so abused and that we had to change to please you. it was all about you; anyone can easily extract that from the way you keep flipping between i'm a newbie, i'm not a newbie but i speak for newbies. if you cannot find the offensive comments you have made along the way or see how some people came to the conclusions they did then it is your fault.
    Anyway, you mentioned the ethernet thread.... thydian is evidently new here. Let's explain that issue out loud (since you already raised the matter). I offered to write a document regarding Ethernet setup, but I was not shown what to do (frozdsolid posted that they were "pretty sure that's necessary"). I do not assume that every member here reads every posted message; thus, I believe I had read the opinion of someone who was as new to Arch as I myself was (frozdsolid's title still shows only 4 posts even today). Nobody here 'handed me an answer' as some might conclude from your post.
    well YOU may not have gotten an answer but people DID try to answer your question and, in fact, the answer is there. but you could not extract it or did not know how to ask the question properly to get the result you desired.
    but of course we are the bad guys here because we took the time to try and give you an answer. man you are such a wanker....
    I posted everywhere else on the internet about my situation using IPX/SPX, and I heard that it was a dead protocol. I finally found information about IPX/SPX; it was not here, so the forum was of no use in that instance - that is a fact, sarah31. Now that I have concluded that IPX/SPX was not the best choice and changed the rest of the household over to TCP/IP...... the IPX/SPX issue is no longer the focus, so I dropped the subject; then, 6 months later, I came back here, saw BluPhoenix's post and was a bit confused. That led to him suggesting DOSemu; I stated no, that was not preferred, etc. etc. I ultimately thanked BluPhoenix, stated why I was going to drop the issue, and left the thread cold.
    so why insist on blaming us for something that we tried to answer but was obviously beyond our knowledge? you did it at the end of that thread and you constantly do it here. how many other people did you verbally assault along the way?
    The whole thing got misdirected, away from what I was asking for, as if the topic was no longer my decision alone.
    what a pile of BS. it was YOUR thread so get in there and assert yourself. threads get out of hand sometimes but the original poster can easily get control again if they have a pair.
    i know for a fact that the people he chose to insult would and have tried to help him but he blows them off. i know too that the head developer is VERY open to user contributions yet dunbar chose not to contribute.
    The reasons should be evident by now - when I offered to contribute, I had the time; 6 months of time transpired, I was not able to write because I had no answers with which to generate such a document. I had to revoke the offer. I am taken by surprise that anyone would say I was given the necessary information!
    BS again you just stated that you got your answer (outside of our forum) so you could have easily posted back with what you had found out and then provided documentation later. and you mean to tell me that you have not had time in the last six months to wing something together. shit you have practically written a novel here.
    it is obvious to me that you just want to guard that knowledge and use it to flame and troll here. once you had the answers you were sure to come back and flame that thread and continue flaming on a regular basis. what an ass.
    I've known you to be patient, sarah31 (and you are yet teaching me as I write), but when you say I'm taking great pains to insult someone - while I'm waiting months for forum responses and I'm reading internet documents that are obsolete and these are docs which talk about a different distro, refer to a different kernel, put files in a different location????
    yeah so you waited for an answer and didn't get one...it happens. you stated that barely anyone knew the answer. finally you got one and then came back and rudely blamed us for poor documentation and a barrel of other things. nice guy.
    all i saw was orelein answer you in a nice and proper fashion and you called him elitist. you also were rather rude about judd in your first post in the future thread. so yes i see all throughout attempts to belittle and berate and not one instance of sober commentary from you.
    and here you are again blaming us for online docs that are not ours.
    It is never appropriate to assume that the newbie will not find older docs and will know enough to discard the incorrect ones.
    this is not limited to newbies.....it can definitely be difficult finding what you need online.
    And since most internet docs have been coasting along since, for example, 1999 (re: the IPX/SPX how-to)
    well, if you are checking out an obscure problem that is actually now obsolete then sure, the fucking doc will be old, but you make it sound like ALL docs are old. so i have to assume that you are very much an idiot, because i have found most current issues to have current docs. most applications will upgrade their docs as they upgrade, or do you even notice that? are you too self-absorbed to go around and find out if your wild accusations actually have any merit?
    once again, and I'll ask this, and directly of you, sarah31: why would any newbie assume that a 4-year-old document applied to their situation??
    well, knowing how newbies are, i would expect them to ask if docs are relevant. or they could possibly look into some of the information. if it was not producing answers...wow, i think they would ask for help again. shit, do you even pay attention to how newbies act on justlinux?
    :oops: I'd ask that people remove useless web documents, but I fear that I'd get only 4 responses. That is reality, not sarcasm.
    hmmm, remove docs, add docs, which is it? fyi arch removes most docs except in rare circumstances. if those docs are html, they are html. if a n00b doesn't know how to view them i expect they would ask (that is, if they are outside x).
    dunbar...stick with slack because arch will never please you. slack is a very nice distro that should have the balance arch does not afford you. that is the great thing about linux.... if you don't like one flavour then try another. just don't go back to the ice cream dealer and berate him for selling you your choice... get the point? (if you don't, then fine, i expected that)

  • Advice on using Flash in the future for Android VS. Air VS. Web Development

    First, I am not trolling and I see that other similar questions have been asked recently, though I am looking for specific info to meet the needs of our program (that is why I am posting this now).
    We've been teaching Flash at a community college for many years. After Adobe stopped development of the Flash Player for mobile browsers, I'm left wondering what the future of Flash on the web will be. Data shows that a majority of web browsing will take place from mobile devices over the next several years, and if you are not going to be able to view Flash web content in a browser on a mobile device, I am concerned about whether it makes sense to continue teaching Flash to students for web design/development. I know Flash can also be used for developing applications, AIR, etc., and we do some of that currently; I'm just wondering how Adobe can expect developers to commit to the platform for web development without supporting mobile web browsers. Am I missing something? Do developers see this differently? I appreciate any insight from Flash developers or Adobe employees.
    FYI - I just did a similar analysis with our video editing program and asked similar questions in an Apple Final Cut Pro forum; we are moving away from Apple's Final Cut Pro after they changed direction with their latest release, and we are now going to start teaching Adobe Premiere. Should we focus on teaching Flash for Android and AIR development and back off from using it for web development?
    Thanks in advance.

    I'd like an answer for this one too.  Thank you, "Media Kat" for posting the question.  I am a prototyper with a background in art and design, who has worked exclusively with Flash for nearly 13 years.  I'm currently working with AIR for Android, but my superiors and associates think AIR is going to die (along with the rest of Flash) and don't understand that simply "switching to HTML5" is not a very palatable answer.  It would be helpful to hear from someone "in the know" about the future of Flash.
    Thanks.
    P.S.  If someone from Adobe answers this, please don't BS me like they did at the last forum I attended at Adobe headquarters in San Jose.  We asked specifically if Flash and/or Flex were dying and were assured that they had a long future... three weeks before the announcement to kill it!  :-(

  • Role of fuji and osgi related things in the future

    hi *,
    i am having difficulty seeing the big picture of what use cases technologies like Fuji / OSGi could have for OpenESB and CAPS 6.X in the future. can someone tell me his ideas on where they could best be used? local deployments?
    what is the target of these things?
    regards chris

    Maybe the possibility of a network of low-memory-footprint app servers that are still centrally managed? It'll give many of the possibilities of the pre-ICAN versions.
    For example, you can package a large array of such modular app servers and deploy them at stores, to address both real-time traffic and scheduled data synchronization.
    Or you can deploy a little server consisting of an embedded JavaMQ, JMS adapter and Oracle eWay on a system that suffers when too many queries are done over the network, so it's easier to build up a little stack of JMS messages that can be processed at the target system's convenience.
    The whole evolution with JEE 6, JSE 7, and the modularity of GlassFish v3 with HK2 is already focused on being able to have low-footprint servers. Because they are so lightweight they are quick to start and stop, and it's easy to implement a little load-balanced "server farm" with, e.g., little servers dealing with HTTP requests and authentication and forwarding them onto a message queue - all this with only a couple of MB for the classes, and maybe something like 128 MB of heap, or likely less. Its small size and the component-based model would give a startup time of, say, 3 seconds.
    And then OSGi adds more to that... imagine parts of CAPS 6 on a BlackBerry... just using a micro-implementation of OpenMQ for dealing with emails, RSS feeds, and whatever subscriptions one might prefer. Not that it can't be done with another tool, but this does make it a lot easier to, e.g., de-allocate email and RSS subscriptions to a small personal segment on a central server, and the client only has to synchronize.
    See where this is going? An app server model that allows for both large monolithic implementations and a network of small 'agents'... As Feynman said, "There's plenty of room at the bottom".
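
    To make the store-and-forward idea above concrete, here is a tiny sketch in Python - not OpenMQ, the JMS API, or any CAPS component; the enqueue/drain/send_batch names are made up for illustration - showing messages being buffered locally and forwarded to the remote system in batches, at whatever pace that system can handle.

```python
import queue
import threading
import time

# Local buffer standing in for the embedded message broker described above:
# producers enqueue work immediately; a single drain thread forwards it to
# the remote system in batches, at that system's pace.
buffer = queue.Queue()

def enqueue(update):
    """Called by the local application; never blocks on the remote system."""
    buffer.put(update)

def drain(batch_size=50, interval_s=30.0, send_batch=print):
    """Periodically forward buffered messages; send_batch stands in for
    whatever adapter actually talks to the target system."""
    while True:
        time.sleep(interval_s)
        batch = []
        while len(batch) < batch_size:
            try:
                batch.append(buffer.get_nowait())
            except queue.Empty:
                break
        if batch:
            send_batch(batch)

threading.Thread(target=drain, daemon=True).start()
enqueue({"store": 42, "event": "inventory-update"})  # example producer call
```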

  • What will complement HANA in the future...

    I'm in a little dilemma. My team was going to get training and get onto HANA projects, but it seems that has been postponed for at least a year. It was disappointing to hear this.
    I'm a BI developer, working with basic ETL, Infocubes, DSO, etc. (only 1 year experience, was ABAP developer before)
    I do have some flexibility to learn something new (I'm still new to BI, so I have that option).
    I want to learn something that will complement HANA when I get involved with it in the future. I know HANA will be replacing many things in BI. There aren't that many options (BusinessObjects, BWA, BEx, or sticking with BI (ETL/DSO/InfoCubes/etc.)).
    What would be the best application/product to learn in BI now that will definitely help/complement learning HANA in the future? (which will also improve my skill set and value as a consultant)

    Just a general remark on this, as I believe many customers are thinking about the best ways to adopt HANA.
    With BW 7.3 SPS 5 it's possible to run BW natively on a HANA system. This includes changed implementations for different key features of SAP BW (e.g. flat infocubes, in-memory DSO, exception aggregation and de-aggregation pushed to the HANA level etc.).
    So it's possible to take the existing BW warehouse and reporting system and run it on HANA.
    Step by step the adoption of the new HANA features can then be done.
    And, often overlooked, HANA is not a data warehouse. It doesn't integrate data sources across the company's data world.
    It doesn't keep historical data stored in data marts in a cleansed and unified way.
    SAP BW does all this - also when it runs on HANA.
    So, in my view, it might be a clever strategy to focus on the design of the current data warehouse so that it's easy to move it to HANA.
    ok - just my 2ct
    Lars

  • The future of Collaboration in SAP NW Portal?

    Hi all,
    I have a mandate from management to implement SAP Collaboration Rooms in our SAP NW Enterprise Portal (NW 7.0); basically, to increase the usability of EP for business users.  However, I have had a hard time convincing myself (and thus management) that SAP Collaboration Rooms would be the tool/product to use going forward (in terms of collaboration).
    First of all, maybe it's just me but I noticed that SAP have gradually separated and dropped the "C" from "KMC".  Then, 'collaboration' is hardly mentioned (as a feature/functionality) in portal's roadmap/release strategy, for releases after NW 7.0; in fact, the latest release strategy for LE only mentioned Duet Enterprise briefly.
    I went to the SDN's main page of 'Collaboration'
    http://www.sdn.sap.com/irj/sdn/nw-collaboration#section4
    only to find the 'Update on SAP NetWeaver Collaboration Services - Status Quo and Outlook' blog, which is supposed to provide up-to-date information, but dates back to 2008.
    And then I found this blog (with a footnote on KMC)
    http://www.sdn.sap.com/irj/scn/weblogs?blog=/pub/wlg/17524
    to be very interesting.
    So, if I understand correctly, SAP expects its 'partner ecosystem to deliver solutions based on Duet Enterprise' (quote from the release strategy) and is encouraging its customers to use third-party software for collaboration such as MS SharePoint, etc.
    It'd be much appreciated if we can have an official confirmation from SAP about the future of SAP Collaboration Rooms in NW Portal and some suggestions as to what would be the best alternatives for its customers if we are to continue using SAP NW Portal.
    With all due respect, this thread is meant to be an open discussion (not to blame anyone or any product) and I'm seeking advice from experienced SDNers; so, all helpful suggestions are much welcomed/appreciated.
    Best regards,
    Dao

    Dual support of the server and client object models is required.
    - the CSOM provides sandbox isolation, support for SharePoint as a SAAS platform (Office 365), and provides support for interoperability with other languages (via JSON, web services, etc)
    - the SSOM provides on-prem installations with more integration capabilities than can be provided via app models/etc. For example, I'll be developing a custom claims provider for ADFS integration starting Monday, which can ONLY be installed as a farm solution.
    Microsoft has been focused on updating SharePoint with newer server-side technologies. WebForms has been outdated for a while now, and SP has been working to include newer technologies such as MVC. This is evident in some places, such as search result templates,
    which have a very MVC-like approach.
    Scott Brickey
    MCTS, MCPD, MCITP
    www.sbrickey.com
    Strategic Data Systems - for all your SharePoint needs

  • The programming language of the future: a philosophical question

    So after reading some of these posts and reflecting on the history of programming languages, I find myself wondering what the programming language of the future will be. As the human-computer interface evolves (touch interfaces, voice recognition, computer vision, etc.), how will this influence the process of creating computer programs? Will programming become more graphical (e.g. flow charts, UML diagrams, etc.)? Setting egos aside, does it make sense for programming to be more graphical? Will we reach a point where we can essentially just sketch out a concept and let the compiler/interpreter handle the rest?
    Perhaps this question is best left to Clarke and Asimov, but they're dead. This leaves you guys.
    (Moderator, I apologize in advance if this is out of scope for this forum.)

    geniuz wrote:I think the mistake of the scientist in question was not that he re-derived the Trapezoid Rule, but that he published it as if it were a new invention, while it already existed and had been published in some form. That could certainly be prevented by checking against the literature before publishing. In fact, I'd not even blame the scientist in question too much, but rather the commission that approved his paper for publication in the first place.
    Sure, he didn't publish the paper in isolation - but the peers who reviewed it would have been people from the same or a related profession. My point is that this kind of error results from splintering of the disciplines. You seem to think that the splintering should be somehow undone or reversed at the level of the peer reviewer or the publisher - but non-specialists (or specialists in other areas) will not be invited to comment, because it will be assumed that they lack the qualifications to do so.
    geniuz wrote:Still, I think it's a long stretch to compare the derivation of a mathematical rule or method to the invention of a complete DSL.
    I think I didn't make it clear where I was going with that. My point was that splintering of specialisations from each other permitted this duplication of effort to take place. I see DSLs as a way to splinter disciplines from each other. Thus, the rise of DSLs would make it easier for duplicate effort to take place.
    geniuz wrote:The whole point of a DSL is that it allows experts in a specific domain to focus their efforts on creatively solving problems directly related to their field of expertise. When specific DSL's are integrated in university curricula world-wide, I hardly think there will be much duplicate effort going on.
    Sorry, but this seems a little naive to me. I can see why you might think DSLs would permit specialists to more efficiently focus on solving their particular problems, and I don't entirely disagree, but even if it's true, I don't think it's controversial to think this will lead to considerably more duplicate effort.
    geniuz wrote:Also, in the world of OSS, there are multiple tools that can perform the exact same job. Do you for instance consider the existence of both of the mail clients Mutt and Alpine as duplicate effort?
    I haven't used Alpine, but if it's exactly like Mutt, then sure, I would certainly say that's duplicate effort. The more people who use one single piece of software, the more bugs can be filed against that software.
    Also, I think this is a false analogy from the start. Mutt and Alpine are both written in C. If we're talking about the connection between duplicate effort and DSLs, let's consider the fact that "communicating and storing computational models of biological processes" (from the Wikipedia page on SBML) could be better done in Lisp, Lex/Yacc, or with the Lemon C++ library, instead of coming up with some new way of using harmful XML.
    geniuz wrote:I still don't quite see how this can be interpreted as a necessary weakness or argument against more complex systems. Sure, DSLs can be dependent upon lower level languages, but if they are considered to increase the effectiveness and efficiency of certain experts, what exactly stops them from becoming dominant and continually evolving?
    The fact that higher and higher levels of specialisation are concomitant with smaller and smaller user bases who still need to communicate with experts in other groups. I'm not saying DSLs should never be used. I just think they are essentially self-limiting, and just as bacteria will long outlive us more complicated life-forms, so too will the lower-level languages long outlive more cumbersome DSLs.
    geniuz wrote:I wasn't implying computers will ever be able to mimic the human brain, and I'm not even sure whether it is something we necessarily want to strive for. All I was saying is that computers have already become indispensable tools in virtually every scientific and engineering discipline. They are computationally strong machines able to solve numerically involved problems at rates no human can ever hope to accomplish. It is this very aspect that will continue to guarantee the success of computers, not AI per se. Again, I believe computers will never (at least not while I'm alive) be able to truly independently mimic and outperform the human brain, especially when it comes to aspects like creativity, i.e. the very aspects of human intelligence scientists have not even been able to understand and quantify to this date. Hence, humans will always remain "in the loop" to a large extent.
    This all seems reasonable.
    geniuz wrote:Don't forget that laws of physics are "laws" that have been defined and created by humankind for its own convenience. Even recently this year, practice has shown that a concept as fundamental as the speed of light might not be as accurate as it was so widely acknowledged by the scientific community. This however hasn't stopped mankind from using these fundamental "laws" to invent e.g. radio communication and electronic devices.
    I don't think you're disagreeing with me. I'm aware of the fragile nature of what we call the laws of physics - but like you say, they're good enough that we can do things with them. It looks like our knowledge of the speed of light breaks down on really large length scales, and it looks like our knowledge of gravity breaks down on really small length scales, but the brain is in the middle. In between, our models for how the physical world works are very accurate, and it is in this regime that the brain operates.
    Also, you should note that most scientists regard those neutrino test results as residing within the bounds of experimental error, and therefore not strongly indicative that the neutrinos really did break light speed.
    geniuz wrote:I haven't said low-level languages will be abandoned completely, I think they will remain to serve their purpose as a base upon which higher level languages (like DSLs) are built. In that sense, I believe that the user base of these low level languages will become more limited to computer scientists, i.e. to the people responsible for "formulating suitable abstractions to design and model complex systems" (source).
    Seems reasonable.
    geniuz wrote:Having said that, I still don't see why it is so far-fetched that for the rest of the world, physically telling a computer what to do in their native tongue, as opposed to typing it in some generic-text programming language, will become the de facto standard. Hence, I will reformulate my statement by stating that programming as most people know it today will eventually become a redundant practice.
    I don't think a friendly human interface should be considered the same as programming. This thread, if we recall the OP, is about programming languages of the future, not user interfaces of the future. I certainly agree that user interfaces will become more intelligent and attractive. I only disagree that this will have any strong impact on how we do programming.

  • What is the future of AS2?

    I'd like to get some thoughts on this from people. For example, how important is it to learn AS2 if you are in the process of learning AS3?
    I started learning AS3 about a year ago with minimal knowledge of AS2. I have spent the last year focusing solely on AS3, and as I continue my education in Flash programming, are there any substantial benefits to spending any time on AS2? The way I've been looking at it is, if I'm going to learn a new technique in Flash I might as well learn it in AS3.
    Any thoughts?
    Thx.

    It depends partly on your plans for the present and the future.  If you plan to be doing freelance design work, then learning AS2 may become more of a necessity than a wishful bit of thinking.  You may end up inheriting older projects that need revisions or new features.  While the same could be said were you to be working for some functional group of some large firm, chances are more likely you would be developing new things rather than changing old things, so learning old things becomes less necessary.  Still, in any situation it is always good to have some ability to speak another language.
    The good thing about learning AS3 is that you are still learning some of the unique coding elements that relate solely to Flash design.  And knowing them, you become knowledgeable as to what capabilities might exist and what you might need to look for in older versions of the language.  My view of programming and learning languages is... you learn coding concepts and capabilities that can be inherently applied to new languages... it is just a matter of determining what syntax/code a new language uses to do what you know it should be able to do based on experience with another language.

  • The future of UIX?

    Given the focus on ADF Faces in the 10.1.3 release of JDev, can Oracle comment on what the future of UIX will be, please?
    The following document:
    http://www.oracle.com/technology/products/jdev/collateral/papers/9.0.5.0/adfuix_roadmap/adfuix_roadmap.html
    ....notes that "A migration tool to upgrade from ADF UIX to ADF Faces will be available in the future". Is this still planned? Does this mean that, given the 10.1.3 release, our JDev production systems (pre 10.1.3) must use this migration tool and convert their existing applications (with the associated regression test costs) in order to take advantage of bug fixes elsewhere in the technology stack? Or will UIX be supported and extended in the future?
    CM.

    Hi Shital and Marcus,
    We will continue to support and develop UIX for a very long time (it is currently used extensively within our eBusinessSuite). Internally ADF Faces is known as UIX 3.0, and we see ADF Faces as a natural evolution of UIX.
    Marcus: "But will there be a feature freeze for UIX or do you plan to have the same functionality / components in UIX and JSF?"
    A: Looking at UIX today and what we have in ADF Faces, the functionality/components are almost identical. There is of course still some work to be done on the ADF Faces side. They are, after all, just in an Early Access release. Eventually there will be a feature freeze of UIX, and you will have to make a decision whether to move to JSF or not, but as I said the UIX framework will be around for a very long time.
    Marcus: "Your statement "same developer experience as in 9.0.5" sounds like there will be improvements for JSF that will not be implemented in UIX?"
    A: Correct. There is a difference between UIX and JSF that we cannot overlook. JSF allows us to provide more design time features than what would be feasible with UIX.
    Shital: "In your opinion should we start using ADF Faces for our new projects and not use UIX?"
    A: There is no clear answer to this, yet. If your requirements are to use production software and your projects have deadlines of late summer to early fall 2005, I would definitely use UIX. If your projects are not starting until after summer, I would start looking at ADF Faces. Remember that there is currently no production release of JDeveloper that has the same feature set for ADF Faces as with ADF UIX. Features like drag and drop data bindings using ADF model (JSR 227) are not there, yet, for ADF Faces.
    Summary
    If you are a front runner (and can wait) go and explore JSF and ADF Faces. If you are depending on production software and cannot wait - use ADF UIX.
    - Jonas
    JDev team

  • The Future of SAP HCM and New Podcast Series

    There is a new monthly podcast series focused on SAP HCM and I thought that may be of interest to forum members. We would love to get your thoughts (in comment section) on who you would like to see as a guest as well as topics for us to cover.
    /people/jarret.pazahanick/blog/2011/09/28/the-future-of-sap-hcm-and-new-podcast-series

    Freddie,
    As many SAP clients have reached a maturity level where they archive their structured (DB data) & unstructured (document data) content, OpenText is the better option.
    Yes, as Deepak quoted, xECM provides many features like:
    360° View on SAP and non-SAP content  
    Collaboration between SAP and non-SAP user
    Business user efficiency with multiple user frontends
    End to End Content Lifecycle Management
    ECM Consolidation on a single platform
    Records Management for SAP & non-SAP content
    Legacy Decommissioning
    Optimize IT & reduce TCO
    Access to Content anywhere/anytime
    Faster processes with SAP Workflow
    Eliminate Paper Inefficiencies
    Disaster Protection of Content
    Secure long-term Storage
    While SAP DMS has its own merits, xECM MAY win.
    Thanks,
    Kolusu

  • Bug? My events on the iPad iCal app aren't shown in the year view if they are more than two years in the future.

    My events in the iPad iCal app aren't shown in the year view if they are more than two years in the future, even though I can see them in the month, week and day views. Any suggestions on how to fix it? I've tried it all. I called Apple support and they checked on their iPads. They all did the same and they couldn't help me. They suggested trying this way. I'd like to be able to plan a few years ahead, and the year view would make things so easy!
    Is this a bug?

    Go to the Home screen and double click the Home button. That will reveal the row of recently used apps at the bottom of the screen. Tap and hold on the app in question until it wiggles and displays a minus sign. Tap the minus sign to actually quit the app. Then tap anywhere on the screen above that bottom row to return the screen to normal. Then restart the app and see if it works normally.
    Then reboot your iPad. Press and hold the Home and Sleep buttons simultaneously, ignoring the red slider, until the Apple logo appears. Let go of the buttons and let the iPad restart. See if that fixes your problem.

  • When iTunes upgraded, many of my song tags were changed by iTunes.  This happened again when an update was sent this week.  Is there any way to avoid this happening again in the future?

    When I downloaded the new iTunes, it changed the tags on some of my tracks.  As a result, I had to fix them manually.  When I downloaded another upgrade this week, my tags changed again.  How can I prevent this from happening in the future?

    The crucial thing is to have a complete backup of your library that you could restore if needed.
    What changes are you noticing? There appear to be some new as yet not well documented features designed to sync data across multiple libraries/devices which may have unwanted side effects.
    tt2

  • How to Send an Email at a Specific Time in the Future Using Mozilla Thunderbird.

    Hi guys,
    I downloaded Thunderbird for my Mac for one single purpose: simply to be able to send an email at the time and date I wish it to be sent, automatically. The extension saves the message to the drafts folder and monitors the messages there; when the chosen time arrives it moves the message to unsent and sends unsent messages. So I can send messages in the future.
    I don't know how to manage this, and I would appreciate help and tutorials from you guys!

    Why not read Thunderbird's tutorials and post in their forum area?

  • The Future Of The N900?

    Hello,
    It was hard enough for me to fathom out what was going on with the N900 before I heard about the Microsoft announcement.
    It seems Nokia has bigger concerns about its own future than about a past product.
    As a layman it seems the N900 is a good phone but will ultimately be limited by its processor. I assume, as with computers, this will increase unless there are limitations.
    BUT, it does seem that all these names I've heard of since owning an N900 - Maemo, MeeGo, Moblin, Intel, Qt, Ovi Beta - do not represent a part of the future in the grand scheme of things for Nokia. I can understand that if Microsoft is necessary for survival.
    I have no experience of the iPod / HTC Android, but it does seem these have stolen a march whilst Nokia has been floundering. I really don't know, if I moved away from the Nokia N900, whether I would be pleasantly surprised or wish for it back.
    In the meantime, if the N900 has a future, interesting or otherwise, it would be good to know, as an upgrade is around the corner; unless the competition is not to be missed, I am happy to stick with the N900 as it does the business for me.
    The thing is I shouldn't need to be thinking like this but it is not obvious to me where the Nokia successor to it is.
    I am very open to responses to my thoughts, which may need correcting. I might not even be on the right track, but it's funny how I am the only person in my group of friends with a Nokia, never mind a N900.
    I would also like to mention that cpitchford has helped me in the past and I consider him to be the man in the know by far. I was gutted to read in a recent post how he felt let down.
    I favor quality brand loyalty, but this seems to be questionable these days.

    hi,
    i do agree with you!! 
    even after losing my 1st nokia n900, i could not digest any other phone as similar and doing the business for me.
    hence, to answer all easy options, i bought my 2nd n900.
    Certainly there is no match for its convenience, leaving aside its processor.
    I am also amongst those who are eagerly waiting for a new version, but the possibility seems to be diminishing.
    hope nokia will revive this exclusive, not-for-the-masses smartphone!!!
