For some random reason I thought I'd straight up just answer all of these, so it's a wall'o'text I'm afraid, hope it helps.
Not being able to use your current monitor to preview resolutions higher than it's capable of isn't a problem new to 4k; it's been true of screens forever. Adverts for colour TVs back when everyone had black and white sets are an old-school example. The best idea is to go into a retail store and see first hand, or read subjective reviews online comparing them.
The primary benefit of all the additional pixels is really to cram more pixels into the same area, making each pixel smaller. You can get a rough idea of the quality by simply sitting further back from your screen so the apparent size in your vision is smaller. You can also get a good idea from a lot of smartphones, which often pack very high resolutions into tiny 5-7" screens; the retina devices are a great example.
1) There are many standard resolutions above 1920x1080, namely 1920x1200 which is the 16:10 aspect ratio variant, and notably two more: 2560x1440, which is again 16:9 (like most TVs), and its 16:10 variant at 2560x1600. Then there are two versions of 4k, 3840x2160 and 4096x2160; the former is what most monitors will be displaying, and it's the equivalent of 4x 1920x1080 screens stacked in a 2x2 grid.
2) 1440p is just 2560x1440. Names like 1080p or 1440p refer to the second part of the screen resolution, the number of horizontal rows of pixels, or put another way the height of the screen in pixels. It's not considered 4k; 4k is really just a branding term and there's no 2k branding equivalent for 2560x1440 (1440p). That said, 1440p has a bit under half the total pixels (width x height) of 4k, so in some loose sense you could consider it a kind of 2k, which is especially relevant for performance (you'd expect to need roughly 2-2.25x the processing power to drive 4k over 2560x1440).
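To put numbers on the pixel-count claims in 1) and 2), here's a quick Python sanity check using the resolutions named above:

```python
# Total pixel counts of the common resolutions, compared against 1080p.
resolutions = {
    "1080p (1920x1080)": (1920, 1080),
    "1440p (2560x1440)": (2560, 1440),
    "16:10 (2560x1600)": (2560, 1600),
    "4K UHD (3840x2160)": (3840, 2160),
}

pixels_1080p = 1920 * 1080
for name, (w, h) in resolutions.items():
    total = w * h
    print(f"{name}: {total:,} pixels ({total / pixels_1080p:.2f}x 1080p)")
```

It confirms 4k really is exactly 4x the pixels of 1080p, while 1440p sits at about 1.78x 1080p, i.e. roughly 44% of 4k.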
The increased resolution with gaming makes the picture sharper; it allows you to see more detail in the scene and discern detail on objects that are smaller or in the distance. If the increase in pixels comes mostly in the form of increased pixel density (rather than increased screen size) then there's less need for things like anti-aliasing.
A) There's no real sense in which we have "1080p games"; generally speaking most games can run at any screen resolution, and if the game engine is built well it will simply render at the higher resolution. For media that is fixed resolution, like movies, you can set your video card drivers to behave however you like: running at the real resolution and adding black borders, stretching to fill the entire monitor, stretching to fit while preserving aspect ratio, and other subtle variations on this. Typically with games you'd simply run the game at the native resolution of your monitor, since stretching a mismatched resolution produces very bad picture quality. There is a caveat here: if one resolution is an exact multiple of the other you can scale without image quality loss, and it turns out that the consumer version of 4k (3840x2160) is exactly 2x the height and 2x the width of 1920x1080, which allows for lossless scaling.
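The lossless-scaling caveat is just integer arithmetic; a tiny sketch of the check (the function name is mine, not from any driver API):

```python
# A source resolution scales losslessly to a target when both dimensions
# divide evenly by the same whole-number factor (each pixel becomes an NxN block).
def scales_cleanly(src, dst):
    sw, sh = src
    dw, dh = dst
    return dw % sw == 0 and dh % sh == 0 and dw // sw == dh // sh

print(scales_cleanly((1920, 1080), (3840, 2160)))  # True: exact 2x2 pixel blocks
print(scales_cleanly((2560, 1440), (3840, 2160)))  # False: a 1.5x factor needs interpolation
```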
Bigger resolution doesn't necessarily mean a bigger monitor, although it frequently does; the quality increase mostly comes from having a higher density of pixels in the same area. Generally speaking, as resolution goes up on new monitors the size does too, but the size increases at a slower rate relative to the resolution, which means the higher res panels are indeed slightly bigger but also have somewhat better pixel density. It depends on the panel though; there's not much stopping manufacturers from making all sorts of combinations of size and resolution.
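Pixel density is easy to compute yourself if you want to compare panels; the standard measure is PPI (pixels per inch along the diagonal). The panel sizes below are just illustrative pairings, not specific products:

```python
import math

# PPI = diagonal resolution in pixels / diagonal size in inches.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

# Illustrative size/resolution combos; actual panels vary.
print(f'24" 1080p: {ppi(1920, 1080, 24):.0f} PPI')
print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')
print(f'32" 4K:    {ppi(3840, 2160, 32):.0f} PPI')
```

Note how a 32" 4k panel is physically bigger than a 24" 1080p one yet still ends up with noticeably denser pixels, which is exactly the "size grows slower than resolution" pattern described above.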
B) Settings preference is really just that, a preference. Whether you prefer higher resolution with medium settings or lower resolution with high settings is completely up to you; it's something subjective that you need to try and gauge for yourself. Anecdotally, I've swapped between a 30" 2560x1600 panel and a 24" 1920x1080 panel for probably 7+ years now, and I enjoy them both for different reasons. I need to sacrifice settings on the 30" panel because its resolution is roughly twice as many pixels, so some games take a performance hit, but older games that my video card handles well look really gorgeous on that kind of panel.
C) Can't really comment, but I hear it's pretty epic. I imagine it's similar to when I upgraded from 1680x1050 to 2560x1600 some 7 years ago; it's a very big leap.
3) 2560x1440 panels tend to come in 27-30" and 2560x1600 in 30-32". 4k can come in a huge range of sizes; some are as small as 32" I think, others are way bigger, essentially like 50-60" TVs. The jump to 4k is a huge one (4x 1080p) and so the range of panel sizes you'll find in production will vary a lot.
4) Simple, preference. G-sync eliminates tearing while maintaining high and smooth frame rates with minimal input lag which has traditionally been a problem for decades in gaming, these people want a solution that will eliminate screen tear, it bothers some people way more than it bothers others. I'm one of the few people who likes screen tearing, each to his own.
A) DX12 is a Microsoft API/standard for Windows that lets games access the graphics and sound hardware in your PC; you don't need DX12 for 4k. Current/modern VR (virtual reality) is not dependent on either 4k or DX12. Virtual reality headsets are very much like monitors: they have a screen with a specific resolution, and while none of them are 4k yet, in future that's likely.
5) Price to performance of new and high end hardware, such as 4k monitors and the GPUs you need to power such a high resolution, is never good. The price to performance ratio of damn near all hardware is always best in the mid range; you always pay a premium for the latest and greatest until the technology matures years later and something better supersedes it.
6) There is usually some kind of approximate sweet spot for monitor size relative to its resolution; with 1080p it was about 24". Some people bought 27", but many found the pixels too big at that size. The problem with increasing screen size is that eventually it becomes too big to use comfortably. It took me a long time to get used to 30", and much bigger would be a waste IMO; too much of the screen sits in your peripheral vision where detail becomes harder to discern.
7) As resolutions get bigger and pixel density increases, anything with a fixed height/width in pixels, such as classic fonts, will appear smaller on your screen, which makes it harder to read for everyone but is potentially problematic if you have weak vision. Typically operating systems and applications come with DPI scaling for fonts and zooming for many apps, which helps mitigate this problem. Many games do not, however, and the HUD can become harder to read if it's not scaled correctly, although more and more these days even games scale HUDs well, which is a step in the right direction.
8) How wowed you are by 4k will differ from person to person, but it's very typical with new technology that you don't fully appreciate the benefit until you're forced to go back to the old tech for some reason; then suddenly you can't live without it. There are diminishing returns with things like screen resolution however: our eyes can only discern so much detail. We're nowhere near that limit yet, but as we approach it we'll appreciate each jump less and less.
A) Most gamers don't use 1440p or 4k; both resolutions are extremely uncommon. The number of people using 2560x1440 and 2560x1600 over the 7-8 years they've been available has been very tiny, no more than a few percent, and the adoption of 4k is even smaller right now. Cost is the primary factor: not only are the panels very expensive, but running games at these resolutions requires extremely high end GPUs. 2560x1600 has required SLI or Crossfire for the 7 years I've had it; only now with the 980 can I power this panel with a single card.
9) This was answered in 2.A, in short the video card can scale resolutions up and down, or fit them to the screen however you like.
10) If you can only afford one or the other (resolution or performance) then it's down to personal preference; some people are happy to game at 30fps with choppy performance and nice graphics, others prefer lower graphics settings and a steady 60-120fps.
11) It will eventually make 1080p obsolete, but not for a long time. I predict that everyone will have 4k TVs probably inside a decade and adoption in the PC space will probably be quite high by then, all resolutions eventually become obsolete, no one uses 640x480 or 800x600 anymore.
12) Everyone can see the difference between 60Hz and 144Hz, despite the ignorant claims that still float around on the internet. 144Hz monitors can refresh 144 times a second, which gives a smoother experience than 60Hz provided your GPU can output more than 60fps (otherwise it's kind of redundant). Again, how much you care about fast refresh rates depends on your subjective experience; some people love it and will sacrifice graphics, others are the opposite.
Response time is a measure of how long it takes a monitor to change the colour of its pixels; it's measured in ms (milliseconds, 1,000ths of a second). If you have a rapidly changing scene on your monitor and a slow pixel response time, the pixels will lag behind the scene and cause an effect called ghosting, which looks bad.
There are many different basic panel types, each with different performance characteristics. Typically TN panels have fast pixel response times and can run at higher refresh rates, but they have bad viewing angles and often bad colour reproduction. IPS panels tend to have much longer pixel response times and are limited (outside of botched hacks) to lower refresh rates, but they boast much superior colour reproduction and extremely good viewing angles. PVA is one you missed, which often sits somewhere in between.
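To make the refresh rate and response time numbers concrete, here's a small Python sketch; the per-panel response times are illustrative figures I've picked for the comparison, not specs for any real product:

```python
# Time each frame is on screen at a given refresh rate.
def frame_time_ms(refresh_hz):
    return 1000 / refresh_hz

for hz in (60, 144):
    print(f"{hz}Hz -> {frame_time_ms(hz):.2f} ms per refresh")

# Illustrative response times (ms); real panels vary by model and settings.
# A panel whose pixels take longer to change than one refresh will ghost.
for panel, response_ms in [("fast TN", 1), ("slower IPS", 8)]:
    ok = response_ms <= frame_time_ms(144)
    print(f"{panel} ({response_ms} ms) keeps up at 144Hz: {ok}")
```

At 144Hz each frame only lasts about 7ms, which is why fast response times matter much more on high refresh panels than at 60Hz, where every frame gets nearly 17ms.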
13) One is the standard 4k resolution for home use (3840x2160); much like 1920x1080 it's a home media and broadcast TV standard. The slightly bigger one (4096x2160) is, I believe, a standard for cinemas.
14) As I mentioned before, there's nothing to stop anyone from making completely custom resolutions. You're talking about the aspect ratio, which is the ratio of the horizontal to the vertical; the home cinema and broadcast TV standard aspect ratio is 16:9, which 1920x1080, 2560x1440 and 3840x2160 all conform to. Remember that 1080p simply refers to the vertical resolution of the panel, so you can have 2 different 1080p panels each with a different width. Panels that deviate from ratified standards tend not to be very popular, since content is quite often developed and targeted for specific standards.
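You can check any resolution's aspect ratio yourself by reducing the two numbers by their greatest common divisor; a quick sketch:

```python
from math import gcd

# Reduce a resolution to its simplest aspect ratio, e.g. 1920x1080 -> 16:9.
# Note the "16:10" panels mathematically reduce to 8:5; 16:10 is just the
# conventional way of quoting it so it's comparable with 16:9.
def aspect(w, h):
    g = gcd(w, h)
    return f"{w // g}:{h // g}"

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160), (1920, 1200), (2560, 1600)]:
    print(f"{w}x{h} -> {aspect(w, h)}")
```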
Lastly, your off topic section. You're talking about input latency of the TV/monitor, which is measured separately and is independent of both refresh rate and pixel response time. Typically latency is caused by TVs/monitors having some kind of digital image processing chip inside them which processes incoming images to make them sharper, cleaner, brighter or more vibrant in some way; quite often if you disable these effects in the menu you can reduce input latency. It's also one of the main reasons TVs make bad monitors: they typically have a lot of input latency. Monitors marketed to gamers often boast very small input latencies for those who are bothered by it.