Advice needed: is BDB a good fit for what I'm aiming at?

Hello everyone,
I'm not a BDB user (yet), but I really think that the BDB library
is the perfect fit for my needs.
I'm designing an application with a "tricky" part that requires a very fast
data storage/retrieval solution, mainly for writes (but for reads too).
Here's a quick summary of this tricky part, which should use at least
two databases:
- the first db will hold references to contents, with a few writes per hour
(the references being "pushed" to it from a separate admin back end), but
an expected high number of reads
- the second db will log requests and other events on the references
contained in the first db: on average, one read from DB1 is expected to
produce five writes into DB2.
To illustrate:
DB1 => ~25 writes / ~100 000 reads per hour
DB2 => ~500 000 writes / *(60?) reads per hour
(*will explain about reads on DB2 later in this post)
Reads and writes on both DBs are not linear: for 500 000 writes per hour,
say, the first 250 000 could happen within 20 minutes. There will be peaks
of activity, and low-activity phases as well.
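To make this concrete, here is a minimal sketch of what I have in mind, using the standard BDB C API (all file/path names, the key choice and the record layout are placeholders of mine, nothing is decided yet):

/* One transactional environment shared by both databases: common cache,
 * locking and write-ahead log. Error handling trimmed for brevity. */
#include <stdlib.h>
#include <string.h>
#include <db.h>

int main(void) {
    DB_ENV *env;
    DB *db1, *db2;

    db_env_create(&env, 0);
    env->open(env, "/var/myapp/env",               /* hypothetical home dir */
              DB_CREATE | DB_INIT_MPOOL | DB_INIT_TXN |
              DB_INIT_LOG | DB_INIT_LOCK | DB_RECOVER, 0);

    /* DB1: content references - few writes, many reads. */
    db_create(&db1, env, 0);
    db1->open(db1, NULL, "refs.db", NULL, DB_BTREE,
              DB_CREATE | DB_AUTO_COMMIT, 0);

    /* DB2: high-volume event log, keyed by timestamp. */
    db_create(&db2, env, 0);
    db2->open(db2, NULL, "events.db", NULL, DB_BTREE,
              DB_CREATE | DB_AUTO_COMMIT, 0);

    /* One logged event: with a NULL txn in a transactional environment,
     * the put() is auto-committed. */
    DBT key, val;
    long long ts = 1700000000000LL;                /* epoch milliseconds */
    char payload[] = "{\"ref\":\"r42\",\"event\":\"read\"}";
    memset(&key, 0, sizeof key);
    memset(&val, 0, sizeof val);
    key.data = &ts;     key.size = sizeof ts;
    val.data = payload; val.size = sizeof payload;
    db2->put(db2, NULL, &key, &val, 0);

    db2->close(db2, 0);
    db1->close(db1, 0);
    env->close(env, 0);
    return EXIT_SUCCESS;
}

(In practice the key would need a uniqueness component, or DB_DUP set on the btree, since two events can easily share the same millisecond.)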
That being said, do the BDB experts here think that BDB is a good fit for
such a need? Whether you do or not, could you please let me know what makes
you think so? Many thanks in advance.
Now, about the "*(60?) reads per hour" for DB2: actually, data from DB2
should be accessed in (near) real time for reporting. As of now, here is
what I think I should do to preserve a high write throughput and not miss
any write to DB2 => once per minute, a new "DB2" is created and starts
recording new events. The "previous" DB2 is then dumped/exported into
another database, which is the one queried for real-time (not exactly
real-time: up to five minutes is an acceptable delay) reporting.
So, in my first approach, DB2 is "stopped" then dumped each minute to
another DB (not necessarily BDB, by the way - the data could probably be
re-structured another way, into another kind of NoSQL storage, to facilitate
querying and retrieval from the admin back end), which would make 60 reads
per hour (but "entire" reads of the full db).
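The per-minute rotation I describe could be as simple as this (a sketch only; the naming scheme is invented, and error handling is omitted):

/* Open a fresh event DB for the current minute inside the existing env;
 * the caller swaps handles at each minute boundary. */
#include <time.h>
#include <db.h>

static DB *open_events_for_minute(DB_ENV *env, time_t now) {
    DB *db;
    char name[64];
    struct tm tm;
    gmtime_r(&now, &tm);
    strftime(name, sizeof name, "events-%Y%m%d-%H%M.db", &tm);
    db_create(&db, env, 0);
    db->open(db, NULL, name, NULL, DB_BTREE, DB_CREATE | DB_AUTO_COMMIT, 0);
    return db;
}

/* At each rotation:
 *   DB *next = open_events_for_minute(env, time(NULL));
 *   DB *prev = current;
 *   current = next;           (writers switch to the new handle)
 *   prev->close(prev, 0);     (closed file is now safe to dump/export)
 */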
The questions are:
- do you think that renewing DB2 this often would improve or strain
performance?
- is BDB good and fast at massive dumps/exports? (OK: 500 000 entries per
hour makes ~8 300 entries per minute on average, so let's say that a dump's
max size is 24 000 rows of data)
- would it be better or not to read directly from the current DB2 while it
is (intensively) storing new rows, which would avoid the need to dump each
minute and provide more real-time features? (then only a daily dump would be
needed, to archive the "old" data) - see the sketch just after this list.
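Regarding that third question, my understanding is that BDB can scan a live database without blocking writers by using degraded isolation; a sketch under that assumption (DB2 would have to be opened with DB_READ_UNCOMMITTED as well):

#include <string.h>
#include <db.h>

/* Dirty-read scan of the live DB2: never blocks on writers' locks, at the
 * cost of possibly seeing uncommitted rows - acceptable for reporting. */
static int scan_live_events(DB *db2) {
    DBC *cur;
    DBT key, val;
    int ret;

    memset(&key, 0, sizeof key);
    memset(&val, 0, sizeof val);
    if ((ret = db2->cursor(db2, NULL, &cur, DB_READ_UNCOMMITTED)) != 0)
        return ret;
    while ((ret = cur->get(cur, &key, &val, DB_NEXT)) == 0) {
        /* feed (key, val) to the reporting pipeline */
    }
    cur->close(cur);
    return ret == DB_NOTFOUND ? 0 : ret;    /* DB_NOTFOUND = end of scan */
}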
Anyone who has already had to face such questions is welcome, as well as
any BDB user who thinks they can help on this topic!
Many thanks in advance for your advice and knowledge.
Cheers,
Jimshell

Hi Ashok
Many thanks for your fast reply again :)
Ashok_Ora wrote:
Great -- thanks for the clarification.
Thank YOU, my first post was indeed a bit confusing, at least about the reads on DB2.
Ashok_Ora wrote:
Based on this information, it appears that you're generating about 12 GB/day into DB2, which is about a terabyte of data every 3 months. Here are some things to consider for ad-hoc querying of about 1 TB of data (which is not a small amount of data).
That's right, this is quite a lot of data, and it will keep growing and growing... Although the main goal of the app is to achieve (almost) real-time reporting, it will also potentially need to compute data over different time ranges, including yearly ranges for instance - but in that case, the real-time capabilities wouldn't be relevant, I guess: if you look at data over a year's span, you probably don't need it to be accurate to the day (well, I guess), so this part of the app would probably only use the "very old" data (not the current day's data), wherever it is stored...
Ashok_Ora wrote:
Query performance is dramatically improved by using indexes. On the other hand, indexing data during the insert operation is going to add some overhead to the insert - this will vary depending on how many fields you want to index (how many secondary indices you want to create). BDB automatically indexes the primary key. Generally, any approach that you consider for satisfying the reporting requirement will benefit from indexing the data.
Thanks for pointing that out! I did envisage using indexes, but my concern was (and you guessed it) the overhead they can be expected to bring. At this stage (but I may be wrong; this is just a study in progress that will also need proper tests and benchmarking), I plan to favour write speed over everything else, to ensure that all the incoming data is indeed stored, even if it is quite tough to handle in its primary stored form.
I prefer to envisage (but again, it's not certain that this is the right way of doing it) very fast inserts, then possibly re-processing the data (sort of) later, and (maybe? certainly?) elsewhere, in order to make it more "query friendly" and efficient for the moderately complex queries behind legible reports/charts.
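To make that overhead trade-off concrete, here is how I understand a secondary index is declared with the C API (a sketch only; the idea that the reference id sits in the first 8 bytes of the value is my own invented layout):

#include <string.h>
#include <db.h>

/* Key extractor: BDB calls this on every primary insert to derive the
 * secondary key (here: the content-reference id) from the record. */
static int ref_of_event(DB *sec, const DBT *pkey, const DBT *pdata, DBT *skey) {
    (void)sec; (void)pkey;
    memset(skey, 0, sizeof *skey);
    skey->data = pdata->data;        /* assumed: ref id = first 8 bytes */
    skey->size = 8;
    return 0;
}

/* Wire-up, given already-open handles (done before any puts):
 *   db_create(&by_ref, env, 0);
 *   by_ref->set_flags(by_ref, DB_DUPSORT);       - many events per reference
 *   by_ref->open(by_ref, NULL, "events-by-ref.db", NULL, DB_BTREE,
 *                DB_CREATE | DB_AUTO_COMMIT, 0);
 *   db2->associate(db2, NULL, by_ref, ref_of_event, 0);
 * Every db2->put() then also maintains by_ref - that is the insert overhead.
 */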
Ashok_Ora wrote:
Here are some alternatives to consider, for the reporting application:
- Move the data to another system like MongoDB or CouchDB as you suggest and run the queries there. The obvious cost is the movement of data and maintaining two different repositories. You can implement the data movement in the way I suggested earlier (close "old" and open "new" periodically).
This is pretty much "in line" with what I had in mind when posting my question here :).
I found out from several benchmarks (there are not a lot, but I did find some ^^) that BDB, amongst others, is optimized for bulk queries, i.e. retrieving a whole lot of data at once is faster than, for instance, retrieving the same row n times. Is that right? Now, I guess this is tightly related to the configuration and the server's performance...
The process would then feed data into a new "DB2" instance every 60 seconds, dumping/merging the previous one into another DB (BDB or else), which would grow until some defined limit.
Would the "old DB2" > "main, current archive" merge be a heavy/tricky process, in your opinion? Especially as the "archive" DB keeps growing and growing - what would be a sensible limit to take into account? I guess that 1 TB for 3 months of data would be a bit big, wouldn't it?
Ashok_Ora wrote:
- Use BDB's SQL API to insert and read data in DB1 and DB2. You should be able to run ad-hoc queries using SQL. After doing some experiments, you might decide to add a few indices to the system. This approach eliminates the need to move the data and maintain separate repositories. It's simpler.
I read a bit about it, and these are indeed very interesting capabilities - especially as I know how to write decent SQL statements.
That would mean that DB2 could grow well beyond a 60-second time span - but would this growth affect the write throughput? I guess so... This will require proper tests, definitely.
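As I understand it, the BDB SQL API exposes the SQLite3 C interface on top of BDB storage, so the reporting side could look like plain SQLite code linked against BDB's SQL library (the table and column names below are just my guesses at a schema):

#include <sqlite3.h>     /* header shipped with the BDB SQL API */

int report_example(void) {
    sqlite3 *db;
    char *err = NULL;

    sqlite3_open("events.db", &db);
    sqlite3_exec(db,
        "CREATE TABLE IF NOT EXISTS events"
        "(ts INTEGER, ref TEXT, payload TEXT);", NULL, NULL, &err);
    sqlite3_exec(db,
        "INSERT INTO events VALUES (1700000000, 'r42', '{}');",
        NULL, NULL, &err);
    /* Ad-hoc reporting; an index can be added later if profiling
     * justifies the insert overhead:
     *   CREATE INDEX idx_events_ref ON events(ref);              */
    sqlite3_close(db);
    return 0;
}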
Now, I plan the "real" data (the "meaningfull part of the data"), except timestamps, to be stored in quite a "NoSQL" way (this term is "à la mode"...), say as JSON objects (or something close to it).
This is why I envisaged MongoDB for instance as the DB layer for the reporting part, as it is able to query directly into JSON, with a specific way to handle "indexes" too. But I'm no MongoDB expert in any way, so I'm not sure at all, again, that it is a good fit (just as much as I'm not sure right know what the proper, most efficient approach is, at this stage).
Ashok_Ora wrote:
- Use the Oracle external table mechanism (Overview and how-to - http://docs.oracle.com/cd/B28359_01/server.111/b28319/et_concepts.htm) to query the data from Oracle database. Again, you don't need to move the data. You won't be able to create indices on the external tables. If you do want to move data from the BDB repository into Oracle DB, you can run "insert into <oracle_table> select * from <external_table_in_DB2>;". As you know, Oracle database is an excellent database for all sorts of applications, including complex reporting applications.
This is VERY interesting. VERY.
And Oracle DB is, you're right, a very powerful and flexible database for every kind of process.
I'll look into the docs carefully, many thanks for pointing that out (again!) :)
I have not yet decided whether the final application will be free or open source, but this will eventually be a real question. Right now, I don't want to think about it; I just want to find the best technical solution(s) to achieve the best possible results.
And BDB and Oracle DB are very serious competitors, definitely ;)
Ashok_Ora wrote:
Hope this was helpful. Let me know your thoughts.
It definitely is very useful! It makes things clearer, allows me to get deeper into BDB (and Oracle as well, with your latest reply), and that's much appreciated. :)
As I said, my primary goal is to ensure the highest write throughput - I cannot miss any incoming data, as there is no (easy/efficient) way to re-request whatever would be lost and be sure it hadn't changed in the meantime (the simple act of re-asking would actually induce data flaws).
So, everything else (including reporting, stats, etc.) IS secondary, as long as whatever comes in is reliably stored (almost) as soon as it comes in.
This is why, in this context, "real" real-time is not really crucial, and it can be "1-minute delayed" real time (it could even be "5-minute delayed", actually, but let's be a bit demanding ^^).
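On the throughput side, my reading of the docs suggests a couple of environment knobs that trade durability for commit speed; a sketch (the cache size is arbitrary, and note that DB_TXN_WRITE_NOSYNC means an OS crash could lose the last instants of data, so it may conflict with the "never miss a write" goal):

#include <db.h>

/* Call before env->open(): relax commit-time log flushing and enlarge
 * the cache, favouring write throughput over strict durability. */
static int tune_for_writes(DB_ENV *env) {
    int ret;
    /* Log records reach the OS at commit but are not fsync'd. */
    if ((ret = env->set_flags(env, DB_TXN_WRITE_NOSYNC, 1)) != 0)
        return ret;
    /* 512 MB cache in one region, to keep hot btree pages in memory. */
    return env->set_cachesize(env, 0, 512 * 1024 * 1024, 1);
}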
Ashok_Ora wrote:
Just out of curiosity, can you tell us some additional details about your application?
Of course - I owe you a bit more detail, as you're helping me a lot in my research/study :)
The application is sort of a tracking service. It is primarily meant to serve the very specific needs of a client of mine: they have several applications that all use the same "contents". Those contents can be anything (text, HTML, images, whatever), and they need to know almost in real time which application (used by which external client/device) is requesting resources, which ones, from where, in which locale/area and language, etc.
It's really a kind of "Google Analytics" thing (which I pointed out at the very beginning, but they need something more specific and, above all, they need to keep all the data with them, so GA is not a solution here).
So, as you can guess, this is pretty... big. On paper, at least. Not sure it will ever be implemented one day, to be honest with you, but I really want to do the technical study seriously and bring out the best options, so that they know where they plan to go.
As for me, I would definitely love it if this could become reality; this is very interesting and exciting stuff. Especially as it requires seeing things as they are and not falling into the "NoSQL fashion" for the sake of being "cool". I don't want a cool application, I want an efficient one that fits the needs ;) What is very interesting here is that BDB is not new at all, yet it's one of the most serious players identified so far!
Ashok_Ora wrote:
Thanks and warm regards.
ashok
Many thanks again, Ashok!
I'll leave this question open, in order to keep posting as I progress (and, above all, to be able to get your thoughts and rewarding comments and advice :) )
Cheers,
Jimshell
